FURRYOS PROJECT MANIFEST
Generated: 2026-01-01 04:17:30.927717
============================================================

--- ISO SEARCH ---
NO ISO FILES FOUND.

--- BUILD LOG DIAGNOSIS (Last 20 lines) ---
build.log does not exist.

============================================================

--- FILE STRUCTURE & CRITICAL CONTENT ---
[DIR] FurryOS/
    ⚠️  WARNING: 'neofetch' found in build.sh
    build.sh
        --- CONTENT START ---
        | #!/bin/bash
        | # FurryOS Master Build Script (Debian 13 Trixie Base)
        | # Usage: sudo ./build.sh
        | 
        | set -e
        | 
        | # Configuration
        | OS_NAME="FurryOS"
        | CODENAME="trixie"
        | ARCH="amd64"
        | DATE=$(date +%Y%m%d)
        | WORK_DIR=$(pwd)
        | 
        | echo "🦊 Starting FurryOS Build [$DATE]..."
        | 
        | # 1. Check Root
        | if [ "$EUID" -ne 0 ]; then
        |   echo "❌ Please run as root (sudo ./build.sh)"
        |   exit
        | fi
        | 
        | # 2. Install Live Build Tools
        | echo "📦 Installing build dependencies..."
        | apt-get update
        | apt-get install -y live-build debootstrap squashfs-tools xorriso isolinux syslinux-utils
        | 
        | # 3. Clean Previous Build
        | echo "🧹 Cleaning previous builds..."
        | lb clean
        | 
        | # 4. Configure the Build
        | echo "⚙️ Configuring live-build for Debian $CODENAME..."
        | lb config \
        |     --distribution $CODENAME \
        |     --architecture $ARCH \
        |     --archive-areas "main contrib non-free-firmware" \
        |     --security true \
        |     --updates true \
        |     --bootappend-live "boot=live components quiet splash hostname=furryos persistence" \
        |     --linux-packages "linux-image linux-headers" \
        |     --iso-volume "$OS_NAME Live $DATE" \
        |     --iso-application "$OS_NAME" \
        |     --memtest none
        | 
        | # 5. Inject Custom Configs & Assets
        | echo "🎨 Injecting configuration and assets..."
        | 
        | # Ensure config directories exist
        | mkdir -p config/includes.chroot/usr/share/backgrounds/furryos/
        | mkdir -p config/hooks/live/
        | mkdir -p config/package-lists/
        | 
        | # Copy your assets (wallpapers, logos)
        | if [ -d "assets" ]; then
        |     cp -r assets/* config/includes.chroot/usr/share/backgrounds/furryos/
        | fi
        | 
        | # Create the branding hook
        | cat << 'HOOK' > config/hooks/live/01-furryos-branding.hook.chroot
        | #!/bin/sh
        | echo "🦊 Hook: Applying FurryOS Identity..."
        | # Update OS Release
        | sed -i 's/PRETTY_NAME=.*/PRETTY_NAME="FurryOS (Rolling)"/g' /etc/os-release
        | sed -i 's/NAME="Debian GNU\/Linux"/NAME="FurryOS"/g' /etc/os-release
        | # Set Hostname
        | echo "furryos" > /etc/hostname
        | echo "127.0.1.1 furryos" >> /etc/hosts
        | HOOK
        | chmod +x config/hooks/live/01-furryos-branding.hook.chroot
        | 
        | # Create the package list
        | cat << 'PKG' > config/package-lists/desktop.list.chroot
        | task-gnome-desktop
        | firmware-linux
        | firmware-iwlwifi
        | firmware-misc-nonfree
        | neofetch
        | htop
        | curl
        | git
        | calamares
        | calamares-settings-debian
        | plymouth
        | plymouth-themes
        | PKG
        | 
        | # 6. Build the ISO
        | echo "🚀 Building ISO... This will take a while."
        | lb build
        | 
        | echo "✅ Build Complete! Your ISO is in this directory."
        --- CONTENT END ---
    manifest.txt
    ⚠️  WARNING: 'neofetch' found in build_log.txt
    build_log.txt
    ⚠️  WARNING: 'neofetch' found in generate_manifest.py
    generate_manifest.py
        --- CONTENT START ---
        | import os
        | import datetime
        | 
        | ROOT_DIR = os.getcwd()
        | OUTPUT_FILE = "manifest.txt"
        | IGNORE_DIRS = {'.git', '.build', 'chroot', 'binary', 'cache', '__pycache__', 'local'}
        | CRITICAL_EXTENSIONS = {'.sh', '.yaml', '.list', '.chroot', '.hook', '.py'}
        | 
        | def get_file_content(filepath):
        |     """Reads file content for debugging."""
        |     try:
        |         with open(filepath, 'r', encoding='utf-8', errors='ignore') as f:
        |             return f.read()
        |     except Exception as e:
        |         return f"[Error reading file: {e}]"
        | 
        | def scan_project():
        |     print(f"🕵️  Scanning FurryOS Project Root: {ROOT_DIR}")
        | 
        |     with open(OUTPUT_FILE, 'w', encoding='utf-8') as report:
        |         report.write(f"FURRYOS PROJECT MANIFEST\n")
        |         report.write(f"Generated: {datetime.datetime.now()}\n")
        |         report.write("="*60 + "\n\n")
        | 
        |         # 1. CHECK FOR ISO FILES
        |         report.write("--- ISO SEARCH ---\n")
        |         iso_found = False
        |         for root, dirs, files in os.walk(ROOT_DIR):
        |             for file in files:
        |                 if file.endswith(".iso"):
        |                     iso_path = os.path.join(root, file)
        |                     size_mb = os.path.getsize(iso_path) / (1024 * 1024)
        |                     report.write(f"[FOUND ISO] {iso_path} ({size_mb:.2f} MB)\n")
        |                     iso_found = True
        |         if not iso_found:
        |             report.write("NO ISO FILES FOUND.\n")
        |         report.write("\n")
        | 
        |         # 2. CHECK BUILD LOG TAIL
        |         report.write("--- BUILD LOG DIAGNOSIS (Last 20 lines) ---\n")
        |         log_file = os.path.join(ROOT_DIR, 'build.log')
        |         if os.path.exists(log_file):
        |             try:
        |                 with open(log_file, 'r', encoding='utf-8', errors='ignore') as f:
        |                     lines = f.readlines()
        |                     tail = lines[-20:] if len(lines) > 20 else lines
        |                     report.write("".join(tail))
        |             except:
        |                 report.write("Could not read build.log\n")
        |         else:
        |             report.write("build.log does not exist.\n")
        |         report.write("\n" + "="*60 + "\n\n")
        | 
        |         # 3. FULL FILE TREE & CONTENT
        |         report.write("--- FILE STRUCTURE & CRITICAL CONTENT ---\n")
        | 
        |         neofetch_count = 0
        | 
        |         for root, dirs, files in os.walk(ROOT_DIR):
        |             # Filter directories
        |             dirs[:] = [d for d in dirs if d not in IGNORE_DIRS]
        | 
        |             level = root.replace(ROOT_DIR, '').count(os.sep)
        |             indent = ' ' * 4 * level
        |             report.write(f"{indent}[DIR] {os.path.basename(root)}/\n")
        | 
        |             subindent = ' ' * 4 * (level + 1)
        |             for file in files:
        |                 file_path = os.path.join(root, file)
        |                 # Check for neofetch ghost
        |                 try:
        |                     with open(file_path, 'r', errors='ignore') as f:
        |                         if "neofetch" in f.read():
        |                             report.write(f"{subindent}⚠️  WARNING: 'neofetch' found in {file}\n")
        |                             neofetch_count += 1
        |                 except: pass
        | 
        |                 report.write(f"{subindent}{file}\n")
        | 
        |                 # If it's a critical config file, dump content
        |                 _, ext = os.path.splitext(file)
        |                 if ext in CRITICAL_EXTENSIONS or file == 'build.sh':
        |                     content = get_file_content(file_path)
        |                     report.write(f"{subindent}    --- CONTENT START ---\n")
        |                     # Indent content
        |                     for line in content.splitlines():
        |                         report.write(f"{subindent}    | {line}\n")
        |                     report.write(f"{subindent}    --- CONTENT END ---\n")
        | 
        |         report.write("\n" + "="*60 + "\n")
        |         report.write(f"DIAGNOSTIC SUMMARY:\n")
        |         if neofetch_count > 0:
        |             report.write(f"❌ CRITICAL: Found {neofetch_count} files still containing 'neofetch'. This WILL break the build.\n")
        |         else:
        |             report.write(f"✅ CLEAN: No 'neofetch' found.\n")
        | 
        |     print(f"✅ Manifest generated at: {os.path.join(ROOT_DIR, OUTPUT_FILE)}")
        | 
        | if __name__ == "__main__":
        |     scan_project()
        --- CONTENT END ---
    [DIR] _OLD_BACKUPS/
        [DIR] ARCHIVE_20251231_201946/
        [DIR] archive_backups/
            organize_project.py.bak
        [DIR] ARCHIVE_20251231/
            TIMESTAMPER.py.bak
            quick_start.sh.bak
            launcher.py.bak
    [DIR] build_artifacts/
        [DIR] build_temp/
        [DIR] output/
        [DIR] logs/
    [DIR] config/
        chroot
        USER_CONFIG.yaml
            --- CONTENT START ---
            | user_profile:
            |   username: anthro_user
            |   fullname: Anthro Architect
            |   timezone: America/Chicago
            |   shell: /bin/zsh # Supercharged: Modernized default shell to Zsh
            | 
            | build_preference:
            |   target_hardware: generic_x86_64
            |   selected_kernel: 6.12-mainline
            |   experience_level: intermediate # Options: beginner, intermediate, advanced, paranoid
            | 
            | hardware_overrides:
            |   force_nvidia_drivers: false
            |   enable_wifi_proprietary: true
            |   display_server: wayland # Supercharged: Explicitly prefer Wayland for modern display management
            |   fractional_scaling: true # Supercharged: Enable fractional scaling for HiDPI displays
            |   hdr_support: true # Supercharged: Enable HDR support where available
            |   variable_refresh_rate: true # Supercharged: Enable VRR (FreeSync/G-Sync compatible)
            | 
            | software_bundles:
            |   gaming: true # Steam, Lutris
            |   creative: false # Blender, GIMP
            |   dev_tools: true # VSCode, Git, Python
            |   office: false # LibreOffice
            | 
            | security:
            |   firewall_enabled: true
            |   block_ads_dns: true
            |   kill_telemetry: true
            |   disk_encryption: luks # Supercharged: Added full disk encryption for enhanced security
            |   apparmor_enabled: true # Supercharged: Added AppArmor for mandatory access control
            | 
            | persistence:
            |   create_partition: true
            |   size_mb: 4096
            |   filesystem_type: btrfs # Supercharged: Modern default filesystem
            |   btrfs_options: # Supercharged: Btrfs specific features
            |     snapshots_enabled: true # Automatic snapshots for rollback
            |     compression: zstd # Transparent compression for performance and space
            |     subvolume_layout: standard # Recommended subvolume structure (/, /home, etc.)
            | 
            | system_optimizations: # Supercharged: New section for core system enhancements
            |   zram_enabled: true # Enable ZRAM for improved memory management
            |   zram_ratio: 0.5 # Allocate 50% of RAM to ZRAM
            |   bootloader_type: systemd-boot # Modern UEFI-native bootloader
            |   dns_resolver: systemd-resolved # Centralized, modern DNS resolution
            |   ntp_client: chrony # Precision NTP client for accurate time synchronization
            |   audio_server: pipewire # Supercharged: Modern audio server for low-latency and advanced features
            | 
            | desktop_experience: # Supercharged: New section for UI/UX enhancements
            |   default_desktop_environment: gnome # Common, well-integrated Wayland DE
            |   gtk_theme: adw-gtk3 # Modern GTK theme for a consistent look
            |   icon_theme: adwaita # Standard Adwaita icon theme
            |   cursor_theme: adwaita # Standard Adwaita cursor theme
            |   font_rendering: subpixel_rgb # Optimize font rendering for LCD screens
            |   font_hinting: full # Enable full font hinting for crisp text
            | 
            | software_distribution: # Supercharged: New section for software delivery
            |   flatpak_enabled: true # Enable Flatpak for sandboxed applications
            |   snap_enabled: false # Explicitly disable Snap (can be toggled if needed)
            |   container_runtime: podman # Modern, daemonless container runtime
            | 
            | power_management: # Supercharged: New section for power efficiency
            |   tlp_enabled: true # Enable TLP for advanced power management on laptops
            --- CONTENT END ---
        common
        source
        binary
        bootstrap
        GENOME.yaml.original
        GENOME.yaml
            --- CONTENT START ---
            | meta:
            |   framework_name: furryos genome
            |   codename: sovereign universe
            |   version: 8.1.0
            |   initial: gemini-3-pro-via-api-key
            |   revision: claude-4.5-sonnet-via-perplexity-pro
            |   timestamp: 2025-12-30 03:47:53 UTC
            |   author: thomas b sweet (anthro teacher)
            |   owner: anthro entertainment llc
            |   license: mit
            |   provenance:
            |     blockchain_anchor: bitcoin block 929481
            |     asset_source: anthroheart.com
            |     domains:
            |       - furry-os.com
            |       - furry-os.org
            |       - anthroheart.com
            |     repository: https://github.com/anthroheart/furryos
            |   philosophy: minimal live installer, maximum user choice, bleeding-edge stability, user empowerment
            |   key_features:
            |     - Wayland as preferred display server
            |     - Enhanced Btrfs integration with automatic snapshots and rollback
            |     - Optimized ZRAM with ZSTD compression
            |     - Comprehensive dynamic theming capabilities
            |     - Default PipeWire audio server
            |     - Integrated Flatpak support for universal applications
            |     - Secure Boot and TPM 2.0 integration for enhanced security
            |     - Systemd-homed for secure and portable user home directories
            |     - Advanced power management with TLP
            |     - Modern kernel features (PDS scheduler, BPF)
            |     - Optional immutable-like root filesystem resilience
            |     - Atomic and transactional system updates with robust rollback
            | live_environment:
            |   description: boots into live mode with visual indicator
            |   visual_indicator:
            |     border: animated pulsing border around entire screen
            |     color: "#FF6B35"
            |     width: 8px
            |     animation: pulse 2s infinite
            |     message: "\U0001F43E LIVE MODE - NOT INSTALLED YET \U0001F43E"
            |     position: top center, always visible
            |     dismiss: false
            |   capabilities:
            |     - test hardware compatibility
            |     - preview desktop environment (Wayland preferred, X11 fallback)
            |     - connect to wifi
            |     - browse web
            |     - access installer wizard
            |     - system diagnostics and repair tools
            |   persistence: false
            |   ram_usage: 512MB minimum, 2GB recommended
            | installer:
            |   type: hybrid live-net
            |   target_size:
            |     live_core: 1.2GB (squashfs with MATE)
            |     net_installer: 300MB (minimal kernel only)
            |   strategy:
            |     offline: installs from USB stick (fast, no internet needed)
            |     online: downloads latest packages during install (slower, but up-to-date)
            |   size: 300MB ISO (minimal kernel + assets)
            |   wizard:
            |     step1_welcome:
            |       ask_experience: true
            |       levels:
            |         beginner: granny mode - automatic everything
            |         intermediate: gamer mode - guided with choices
            |         advanced: hacker mode - full control, includes advanced Btrfs/Wayland/Networking options
            |         paranoid: ghost mode - privacy first, immutable root option available, enhanced security
            |     step2_hardware:
            |       auto_detect:
            |         - cpu
            |         - gpu
            |         - ram
            |         - storage
            |         - wifi
            |         - tpm_chip # Detect presence of TPM 2.0 chip
            |         - secure_boot_status # Detect if Secure Boot is enabled/supported
            |         - fwupd_support # Detect if system hardware supports fwupd for firmware updates
            |       ask_proprietary:
            |         nvidia: install cuda drivers? (with Wayland compatibility layers, GL/Vulkan support)
            |         amd: install rocm drivers? (for GPU compute)
            |         wifi: install firmware?
            |     step3_storage:
            |       disk_selection: graphical partition editor
            |       filesystem_options:
            |         ext4: default - stable, journaled (recommended)
            |         btrfs:
            |           description: advanced - snapshots, compression, subvolumes, send/receive for robust system management
            |           features: [snapshots, compression, subvolumes, send/receive, copy-on-write, data_integrity_checksums]
            |           subvolume_layout: "@ @home @var @opt @srv @cache @log @tmp @swap" # Standard layout for root, home, and other system directories
            |           mount_options: "compress=zstd:3,ssd,noatime,space_cache=v2" # Recommended mount options for performance and efficiency
            |           automatic_snapshots:
            |             enable: true
            |             frequency: daily, pre-update, pre-boot, pre-kernel-upgrade # Automatic snapshots for system resilience
            |             tool: snapper/btrfs-assistant # Tools used for managing snapshots
            |           snapshot_boot_support: true # Ability to boot into previous system snapshots via GRUB
            |           rollback_on_failure: true # Automated rollback if system update or boot fails
            |           maintenance_tasks:
            |             scrub: monthly # Automatic data integrity check
            |             balance: quarterly # Rebalance data across disks (if multi-device) or optimize allocation
            |             defrag: optional # On-demand defragmentation for specific files/directories
            |         zfs: enterprise - raid, deduplication (external modules, for advanced users)
            |         xfs: performance - large files, databases
            |         f2fs: flash - ssd/nvme optimized
            |         ntfs: compatibility - windows dual boot
            |       encryption:
            |         enable: optional
            |         method: luks2 aes-256-xts
            |         recovery_key: generate and display
            |         tpm_unlock_support: optional # Use TPM 2.0 for automatic LUKS unlock and integrity verification
            |         yubikey_support: optional # FIDO2/U2F support for LUKS unlock
            |       root_filesystem_strategy: # Option for an immutable-like root
            |         default: mutable_read_write
            |         advanced_options:
            |           immutable_root:
            |             enable: false # Default off, but configurable for advanced/paranoid users
            |             description: Read-only root with stateful overlayfs for system resilience and security
            |             details: 'Requires Btrfs and overlayfs; user changes persist in overlay, system files are immutable for security and easy rollback. Updates are transactional via Btrfs snapshots and atomic updates.'
            |             available_for: [advanced, paranoid]
            |             update_method: atomic_with_rollback # Ensures system integrity during updates
            |     step4_packages:
            |       base_system:
            |         components: [minimal kernel, systemd, pipewire, flatpak, systemd-resolved, fwupd] # PipeWire as default audio, Flatpak for app distribution
            |         always_installed: true
            |       desktop:
            |         none: server headless
            |         mate: recommended - lightweight, stable (X11 default, Wayland optional session)
            |         gnome: modern - touch friendly (Wayland preferred, X11 fallback session)
            |         xfce: minimal - low resources (X11 only)
            |         kde: feature rich - customizable (Wayland preferred, X11 fallback session)
            |         sway: tiling manager (Wayland native, for advanced users, minimal resource)
            |         hyprland: dynamic tiling manager (Wayland native, GPU-accelerated, for advanced users)
            |       bundles:
            |         gaming:
            |           - steam
            |           - lutris
            |           - wine
            |           - proton
            |           - openrgb
            |           - gamemode # Optimize system for gaming performance
            |           - mangohud # In-game performance overlay
            |         development:
            |           - vscode
            |           - git
            |           - docker
            |           - python
            |           - gcc
            |           - nodejs
            |           - podman # Alternative container runtime for rootless containers
            |           - nix # Nix package manager for reproducible builds and environments
            |           - distrobox # Create containerized developer environments
            |           - devcontainers_support # Integration for VS Code Dev Containers
            |         multimedia:
            |           - gimp
            |           - blender
            |           - audacity
            |           - kdenlive
            |           - obs
            |           - davinci_resolve_free # Professional video editing (if Debian compatible)
            |           - shotcut
            |         office:
            |           - libreoffice
            |           - thunderbird
            |           - pdf-tools
            |         pentesting:
            |           - nmap
            |           - wireshark
            |           - metasploit
            |           - burpsuite
            |         server:
            |           - nginx
            |           - mariadb
            |           - php
            |           - docker
            |           - fail2ban
            |           - cockpit # Web-based interface for server administration
            |           - cloud_init_tools # For cloud deployments and initial setup
            |       post_install: package manager always available
            |     step5_network:
            |       hostname: ask user or generate furry-{random}
            |       domain: furry.local
            |       wifi_setup: scan and connect during install
            |       firewall: enable ufw by default (with a secure baseline profile)
            |       vpn_setup: # Integrated VPN client setup
            |         enable: optional
            |         protocols: [wireguard, openvpn, ikev2]
            |         client_tools: [network-manager-wireguard, openvpn, strongswan]
            |         dns_privacy:
            |           - dns_over_tls # Configure systemd-resolved for DoT
            |           - dns_over_https # Configure systemd-resolved for DoH
            |     step6_users:
            |       root: locked - console only
            |       admin_user: sudo access, password required
            |       standard_users: optional additional accounts
            |       guest_mode: enable ephemeral guest account?
            |       home_directory_encryption: # Option for systemd-homed encrypted home directories
            |         enable: optional
            |         method: systemd-homed (encrypted, portable, and snapshot-aware home directories)
            |         available_for: [intermediate, advanced, paranoid]
            |         fido2_passkey_support: optional # Enable FIDO2/Passkey authentication for systemd-homed
            |   download_packages:
            |     method: parallel downloads from debian mirror
            |     fallback_mirrors:
            |       - deb.debian.org
            |       - ftp.us.debian.org
            |       - ftp.uk.debian.org
            |     cache: save to /var/cache/apt for offline reinstall
            | taxonomy:
            |   kingdom:
            |     desktop: full gui, mate desktop (X11/Wayland support)
            |     server_full: gui + tui dashboard
            |     server_headless: pure tui, 150mb ram
            |     embedded: raspberry pi / iot
            |     live_usb: portable, no persistence
            |     immutable_desktop: read-only root, atomic updates, Btrfs snapshots, robust rollback capabilities
            |   phylum:
            |     base_distro: debian
            |     release: bookworm 12
            |     kernel:
            |       source: mainline linux kernel
            |       version: 6.12+
            |       size: minimal - only essential drivers
            |       custom_patches:
            |         - zram
            |         - realtime-audio
            |         - pds_scheduler # Process Distribution Scheduler for improved responsiveness
            |         - bpf_runtime_enhancements # Enhanced BPF for network and security
            |         - low_latency_optimizations # General kernel tuning for desktop responsiveness
            |       firmware: downloaded during install if needed (via fwupd)
            |     bootloader: grub2 (universal compatibility, grub2-efi-signed for Secure Boot compatibility)
            |     bootloader_alternatives:
            |       systemd_boot: optional (for UEFI systems, integrates well with Btrfs snapshots and atomic updates)
            |   class:
            |     x86_64: amd64 primary target
            |     aarch64: raspberry pi 4/5
            |     riscv64: future proof
            |   order:
            |     granny: maximum ease, automatic updates
            |     gamer: performance first (with gamemode and optimal drivers), gaming-specific optimizations
            |     hacker: development tools, full control, advanced system options, containerization focus
            |     ghost: privacy paranoid, immutable by default, enhanced security, network hardening
            |   family:
            |     network:
            |       dns: systemd-resolved
            |       firewall: ufw
            |       ad_blocking: optional post-install (system-wide via AdGuard Home/Pi-hole integration)
            |       network_manager: networkmanager (with support for advanced configurations)
            |     security:
            |       encryption: luks2
            |       keygen: ed25519
            |       secure_boot_support: full # Comprehensive support for UEFI Secure Boot
            |       tpm_integration: optional_luks_unlock_and_integrity_check # TPM for LUKS unlock and system integrity verification
            |       apparmor_profile: default_enforcing # AppArmor enabled by default with a secure profile
            |       kernel_hardening: enabled # Default kernel hardening features
            |       user_auth_methods: [password, fido2, tpm_pin] # Support for multiple authentication methods
            |     ui:
            |       display_server: wayland (preferred), x11 (fallback)
            |       wayland_compositors: [gnome-shell, kwin, sway, hyprland] # Pre-configured Wayland compositors
            |       theme:
            |         name: furryos-midnight (dark)
            |         dynamic_accent_color: true # User-configurable dynamic accent colors
            |         light_dark_mode_switching: auto_or_manual # Automatic switching based on time/location or manual toggle
            |         user_customization: comprehensive # Extensive theming options for all desktop components (GTK, Qt, Shell)
            |         icon_theme: furryos-icons-vector # Vector-based icons for scalability
            |         cursor_theme: furryos-cursors
            |         gtk_theme_engine: adwaita-qt/kvantum (for consistent look)
            |         qt_theme_engine: adwaita-qt/kvantum (for consistent look)
            |       fonts: liberation sans, noto, nerd-fonts (for power users and development), font_rendering_config (subpixel, hinting)
            |       boot_animation: plymouth # Themed boot animation
            |     storage:
            |       filesystem: user choice (Btrfs recommended for advanced features like snapshots and resilience)
            |       swap:
            |         method: zram (auto-sized with systemd-zram-generator)
            |         auto_size_ratio: 0.5 # 50% of RAM
            |         max_size_gb: 16 # Cap ZRAM size to prevent excessive memory usage
            |         compression_algorithm: zstd # Faster and more efficient compression for ZRAM
            |         priority: 100 # High priority for zram swap
            |     audio_server: pipewire # Default and fully configured PipeWire for modern audio management
            |     power_management:
            |       tool: tlp (default) # TLP for optimized power savings on laptops
            |       options: [auto-tune, laptop-mode-tools, powertop (optional for advanced analysis)]
            |       earlyoom: enabled # Prevents system freezes during OOM situations
            |     app_distribution:
            |       flatpak: default (integrated into GUI and CLI package managers, with xdg-desktop-portal support)
            |       snap: optional (user choice during install or post-install)
            |       appimage: integrated (desktop file generation and execution permissions)
            |       distrobox: pre-installed for containerized development environments
            |     user_management:
            |       systemd_homed: optional (secure, portable, encrypted home directories)
            |   genus:
            |     modules:
            |       heartbeat: system orchestrator
            |       healer: watchdog service
            |       vault: encryption manager
            |       network_guardian: firewall + ad block
            |       remote_paw: ssh + rdp manager
            |       metadata_wrangler: media file tagger
            |       update_manager: # Dedicated update manager for transactional updates
            |         description: handles atomic updates, Btrfs snapshots, and rollbacks for system stability using systemd-boot/grub-btrfs/snapper
            |         type: offline/transactional (e.g., based on systemd-boot/grub-btrfs/snapper)
            |         notification_system: desktop_alerts, system_tray_icon
            | build:
            |   iso_type: hybrid (bios + uefi)
            |   bootloader: grub2
            |   compression: xz -9
            |   base_iso:
            |     auto_download: true
            |     url: https://cdimage.debian.org/debian-cd/current-live/amd64/iso-hybrid/debian-live-13.2.0-amd64-mate.iso
            |     checksum_url: https://cdimage.debian.org/debian-cd/current-live/amd64/iso-hybrid/SHA256SUMS
            |     verify: true
            |   included_assets:
            |     splash_screens: /furryos/assets/splash/*.png
            |     icons: /furryos/assets/icons/*.svg
            |     sounds: /furryos/assets/sounds/*.ogg
            |     wallpapers: /furryos/assets/wallpapers/*.jpg
            |     fonts: /furryos/assets/fonts/*.ttf
            |     wayland_compositors_config: /furryos/assets/wayland/*.conf # Configuration files for Wayland compositors
            |     xdg_portal_config: /furryos/assets/xdg-portal/*.conf # Configuration for XDG desktop portals
            |   output:
            |     name: furryos-{version}-{arch}.iso
            |     size_target: 300MB
            |     bootable_methods:
            |       - usb-dd
            |       - rufus
            |       - etcher
            |       - ventoy
            |       - secure-boot-uefi # Explicit support for booting with UEFI Secure Boot enabled
            |   compiler:
            |     cpp: g++
            |     standard: c++20
            |     flags: -O3 -flto -Wall -pthread
            |     linker: -lssl -lcrypto -lsqlite3
            |   python:
            |     version: 3.12+
            |     remove_externally_managed: true
            |     packages:
            |       - pyyaml
            |       - requests
            |       - pillow
            |       - mutagen
            | post_install:
            |   package_manager:
            |     gui: furryos package browser
            |     cli: apt
            |     features:
            |       - search by category
            |       - one click install
            |       - dependency resolution
            |       - automatic updates (optional, with transactional safeguards)
            |       - flatpak_integration: true # Seamless management of Flatpak applications
            |       - snap_integration: optional # Optional management of Snap applications
            |       - appimage_management: true # Integrated management for AppImage applications (e.g., desktop entry generation)
            |       - btrfs_assistant_integration: true # GUI for Btrfs snapshot management
            |       - theming_tool: furryos-theme-manager # GUI for comprehensive system theming
            |   asset_downloader:
            |     anthroheart_pack:
            |       url: https://anthroheart.com/assets/The_AnthroHeart_Collection_Bundle.7z
            |       size: 9GB
            |       optional: true
            |       description: blockchain verified media library
            |     furryos_pack:
            |       url: https://anthroheart.com/assets/FurryOS.7z
            |       size: 6MB
            |       description: blockchain verified debian 13 based operating system
            |   firmware_updater: fwupd # Integrated tool for updating system firmware
            | pain_points:
            |   python_externally_managed: removed on install
            |   boot_issues: grub auto-repair + fallback (with Btrfs snapshot boot option for recovery, and systemd-boot for advanced UEFI users)
            |   wifi_drivers: firmware-iwlwifi, firmware-realtek included (and automatic detection of other needed firmware)
            |   nvidia_pain: auto-detect, offer driver choice (with full Wayland compatibility considerations, G/Vulkan support)
            |   sound_issues: pipewire default (full-featured, low-latency setup with robust hardware support and easy device switching)
            |   no_trailing_slash: filesystem enforced
            |   no_spaces_filenames: auto convert to underscores
            |   auto_resize_wallpaper: desktop wallpaper scales right at first
            |   wayland_app_compatibility:
            |     description: Some legacy X11 applications may require XWayland; ensure smooth integration
            |     resolution: XWayland enabled by default, clear user guidance and recommended native Wayland apps, robust xdg-desktop-portal implementation
            |   atomic_update_resilience:
            |     description: Handling of partial updates or power loss during critical system updates
            |     resolution: Btrfs snapshots and transactional updates (e.g., via `apt-btrfs-snapshot` or `snapper`) mitigate risks and enable easy rollbacks
            |   accessibility:
            |     description: Ensuring the OS is usable for individuals with diverse needs
            |     resolution: Pre-installed screen readers (Orca), high contrast themes, scalable UI elements, and keyboard navigation support
            --- CONTENT END ---
        USER_CONFIG.yaml.original
        [DIR] bootloaders/
        [DIR] includes.chroot_before_packages/
        [DIR] apt/
        [DIR] includes.chroot_after_packages/
        [DIR] includes.source/
        [DIR] archives/
        [DIR] packages.binary/
        [DIR] includes.chroot/
            [DIR] etc/
                [DIR] skel/
                    [DIR] Desktop/
                        DOWNLOAD_ANTHROHEART_PACK.txt
                [DIR] xdg/
                    [DIR] autostart/
                        furryos-startup-sound.desktop
            [DIR] usr/
                [DIR] share/
                    [DIR] backgrounds/
                        [DIR] furryos/
                            heartbeat_core.c
                            Makefile_optimized
                            icon.png
                            heartbeat_core_asm.s
                            computer.png
                            AnthroHeart_Trinity.png
                            wallpaper.png
                            Gemini_API.key.txt
                            Cio as Anthro.png
                            healer_core.cpp
                            AnthroHeart Trinity.png
                            [DIR] icons/
                            [DIR] wallpapers/
                            [DIR] images/
                            [DIR] splash/
                            [DIR] sounds/
                            [DIR] Original 7z Blockchain Receipts/
                    [DIR] sounds/
                        [DIR] furryos/
        [DIR] packages/
        [DIR] hooks/
            [DIR] normal/
                8000-remove-adjtime-configuration.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Reset generated file
                    | 
                    | cat > /etc/adjtime << EOF
                    | 0.0 0 0.0
                    | 0
                    | UTC
                    | EOF
                    --- CONTENT END ---
                8050-remove-openssh-server-host-keys.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Remove OpenSSH Host Keys.
                    | #
                    | # This removes openssh-server host keys, they are regenerated by live-config
                    | # on system start.
                    | 
                    | rm -f /etc/ssh/ssh_host_*_key /etc/ssh/ssh_host_*_key.pub
                    --- CONTENT END ---
                5050-dracut.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Adjust the remaining bits for dracut-live instead of initramfs-tools.
                    | 
                    | if [ ! -d /usr/share/doc/dracut-live ]
                    | then
                    | 	exit 0
                    | fi
                    | 
                    | # Get access to LB_PARENT_DISTRIBUTION_CHROOT
                    | . /live-build/config/bootstrap
                    | 
                    | # Remove remainder of initramfs-tools
                    | apt-get remove --purge --yes initramfs-tools
                    | # Remove live packages that work with initramfs-tools
                    | apt-get remove --purge --yes live-tools
                    | apt-get remove --purge --yes live-boot
                    | apt-get remove --purge --yes live-boot-initramfs-tools
                    | apt-get autoremove --yes
                    | 
                    | # Dracut mounts on /run/initramfs/live
                    | # d-i, calamares and debian-installer-launcher have /run/live/medium hardcoded
                    | # d-i -> fixed in live-build: installer_debian-installer
                    | # calamares -> fixed here
                    | # debian-installer-launcher -> probably not needed, is not part of the regular images
                    | 
                    | # Adjust the path for Calamares
                    | if [ -e /etc/calamares/modules/unpackfs.conf ]
                    | then
                    | 	sed --follow-symlinks -i -e 's|/run/live/medium|/run/initramfs/live|' /etc/calamares/modules/unpackfs.conf
                    | fi
                    | # Use dracut instead of initramfs-tools
                    | if [ -e /etc/calamares/settings.conf ]
                    | then
                    | 	sed --follow-symlinks -i -e '/initramfscfg/d;s/initramfs/dracut/' /etc/calamares/settings.conf
                    | fi
                    | # Add dracut-live to the list of packages to uninstall
                    | if [ -e /etc/calamares/modules/packages.conf ]
                    | then
                    | 	sed --follow-symlinks -i -e "s/'live-boot'/'dracut-live'/" /etc/calamares/modules/packages.conf
                    | fi
                    | # Calamares script for /etc/apt/sources.list during the installation
                    | SOURCES_MEDIA=/usr/share/calamares/helpers/calamares-sources-media
                    | if [ -e /usr/sbin/sources-media ]
                    | then
                    | 	# Until calamares-settings-debian 13.0.11 the filename was more generic
                    | 	SOURCES_MEDIA=/usr/sbin/sources-media
                    | fi
                    | if [ -e ${SOURCES_MEDIA} ]
                    | then
                    | 	sed -i -e 's|/run/live/medium|/run/initramfs/live|;s|/run/live|/run/initramfs|' ${SOURCES_MEDIA}
                    | 	sed -i -e "s|RELEASE=\".*\"|RELEASE=\"${LB_PARENT_DISTRIBUTION_CHROOT}\"|" ${SOURCES_MEDIA}
                    | fi
                    --- CONTENT END ---
                8010-remove-backup-files.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Remove backup files
                    | rm -f /boot/*.bak
                    | rm -f /boot/*.old-dkms
                    | 
                    | rm -f /etc/apt/sources.list~
                    | rm -f /etc/apt/trusted.gpg~
                    | 
                    | rm -f /etc/passwd-
                    | rm -f /etc/group-
                    | rm -f /etc/shadow-
                    | rm -f /etc/gshadow-
                    | 
                    | rm -f /var/cache/debconf/*-old
                    | rm -f /var/lib/dpkg/*-old
                    | 
                    | rm -f /usr/share/info/dir.old
                    --- CONTENT END ---
                8070-remove-temporary-files.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Remove temporary files
                    | for _DIRECTORY in /tmp /var/tmp
                    | do
                    | 	rm -rf ${_DIRECTORY}
                    | 
                    | 	mkdir -p ${_DIRECTORY}
                    | 	chmod 1777 ${_DIRECTORY}
                    | done
                    | 
                    | # Remove the old lock file which will be generated when needed
                    | rm -f /etc/.pwd.lock
                    | 
                    | # Remove /run/mount/utab of util-linux libmount (and its directory)
                    | # The file and directory will be generated when needed
                    | if [ -d /run/mount ]; then
                    | 	rm -f /run/mount/utab
                    | 	rmdir --ignore-fail-on-non-empty /run/mount
                    | fi
                    --- CONTENT END ---
                9010-remove-python-pyc.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Remove Python *.pyc files.
                    | #
                    | # This removes byte-compiled Python modules to save some space.
                    | 
                    | find /usr -name "*.pyc" -print0 | xargs -0r rm -f
                    --- CONTENT END ---
                5020-update-glx-alternative.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Use mesa renderer by default
                    | if [ -e /etc/alternatives/glx ]
                    | then
                    | 	update-alternatives --quiet --set glx /usr/lib/mesa-diverted
                    | fi
                    --- CONTENT END ---
                1000-create-mtab-symlink.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Create /etc/mtab symlink, replacing a regular file if necessary
                    | 
                    | if [ ! -L /etc/mtab ]
                    | then
                    | 	rm -f /etc/mtab
                    | 	ln -s /proc/mounts /etc/mtab
                    | fi
                    --- CONTENT END ---
                9020-remove-man-cache.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Remove the cache
                    | rm -rf /var/cache/man/*
                    --- CONTENT END ---
                9000-remove-gnome-icon-cache.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Remove GNOME icon cache.
                    | #
                    | # This saves some space.
                    | 
                    | rm -f /usr/share/icons/*/icon-theme.cache
                    --- CONTENT END ---
                8090-remove-ssl-cert-snakeoil.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Remove ssl-cert snakeoil
                    | 
                    | if [ -e /etc/ssl/certs/ssl-cert-snakeoil.pem ]
                    | then
                    | 	rm -f /etc/ssl/certs/$(openssl x509 -hash -noout -in /etc/ssl/certs/ssl-cert-snakeoil.pem)
                    | 
                    | 	rm -f /etc/ssl/certs/ssl-cert-snakeoil.pem
                    | 	rm -f /etc/ssl/private/ssl-cert-snakeoil.key
                    | fi
                    --- CONTENT END ---
                8030-truncate-log-files.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Truncate log files
                    | for _FILE in $(find /var/log/ -type f)
                    | do
                    | 	truncate --no-create --size=0 ${_FILE}
                    | done
                    --- CONTENT END ---
                8020-remove-dbus-machine-id.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Remove dbus machine id.
                    | #
                    | # This removes dbus machine id that cache that makes each system unique.
                    | 
                    | rm -f /var/lib/dbus/machine-id
                    --- CONTENT END ---
                8040-remove-mdadm-configuration.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Remove generated files
                    | 
                    | rm -f /etc/mdadm/mdadm.conf
                    --- CONTENT END ---
                8100-remove-udev-persistent-cd-rules.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Remove udev persistent rules.
                    | #
                    | # This removes udev persistent rules that cache the host systems cd drive as
                    | # well as the running live systems cd drive to remember its device name.
                    | 
                    | if [ -e /etc/udev/rules.d ]
                    | then
                    | 	> /etc/udev/rules.d/70-persistent-cd.rules
                    | fi
                    --- CONTENT END ---
                1010-enable-cryptsetup.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Enable cryptsetup
                    | 
                    | if [ -e /sbin/cryptsetup ]
                    | then
                    | 	if [ ! -e /etc/initramfs-tools/conf.d/cryptsetup ]
                    | 	then
                    | 		mkdir -p /etc/initramfs-tools/conf.d
                    | 
                    | 		cat > /etc/initramfs-tools/conf.d/cryptsetup <<-EOF
                    | 		# /etc/initramfs-tools/conf.d/cryptsetup
                    | 
                    | 		CRYPTSETUP=yes
                    | 		export CRYPTSETUP
                    | 		EOF
                    | 
                    | 	fi
                    | 
                    | 	if [ -e /etc/cryptsetup-initramfs/conf-hook ]; then
                    | 		if grep -q '^#CRYPTSETUP=' /etc/cryptsetup-initramfs/conf-hook; then
                    | 			sed -i -e 's/^#CRYPTSETUP=.*/CRYPTSETUP=y/' \
                    | 			    /etc/cryptsetup-initramfs/conf-hook
                    | 		else
                    | 			echo "CRYPTSETUP=y" >>/etc/cryptsetup-initramfs/conf-hook
                    | 		fi
                    | 	fi
                    | fi
                    --- CONTENT END ---
                5000-update-apt-file-cache.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Update the Apt File cache.
                    | #
                    | # This allows to use using apt-file out-of-the-box.
                    | 
                    | . /live-build/config/binary
                    | 
                    | if command -v apt-file >/dev/null && [ "${LB_APT_INDICES}" = "true" ]
                    | then
                    | 	apt-file update
                    | fi
                    --- CONTENT END ---
                8080-reproducible-glibc.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Remove the non-reproducible file that is created by ldconfig
                    | #
                    | # The file does not need to exist, see elf/cache.c:load_aux_cache
                    | # The file and folder will be recreated when needed, see elf/cache.c:save_aux_cache
                    | rm -fr /var/cache/ldconfig
                    --- CONTENT END ---
                8060-remove-systemd-machine-id.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Blank out systemd machine id. If it does not exist, systemd-journald
                    | # will fail, but if it exists and is empty, systemd will automatically
                    | # set up a new unique ID.
                    | 
                    | if [ -e /etc/machine-id ]
                    | then
                    | 	rm -f /etc/machine-id
                    | 	: > /etc/machine-id
                    | fi
                    --- CONTENT END ---
                5030-update-plocate-database.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Update the [mp]locate database.
                    | #
                    | # It is convenient for this to be already up to date on the live system, and it
                    | # means that if the live system is later installed to a hard disk then less
                    | # work will be required after installation.
                    | 
                    | # Up to Bullseye: mlocate
                    | if command -v updatedb.mlocate >/dev/null
                    | then
                    | 	updatedb.mlocate
                    | fi
                    | 
                    | # Bookworm and later: plocate
                    | if command -v updatedb.plocate >/dev/null
                    | then
                    | 	updatedb.plocate
                    | fi
                    --- CONTENT END ---
                8110-remove-udev-persistent-net-rules.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Remove udev persistent rules.
                    | #
                    | # This removes udev persistent rules that cache the host systems mac address to
                    | # remember its device name.
                    | 
                    | for _FILE in /etc/udev/rules.d/*persistent-net.rules
                    | do
                    | 	if [ -e "${_FILE}" ]
                    | 	then
                    | 		: > ${_FILE}
                    | 	fi
                    | done
                    --- CONTENT END ---
                1020-create-locales-files.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Create /etc/environment and /etc/default/locale
                    | touch /etc/environment
                    | echo "LANG=C.UTF-8" >/etc/default/locale
                    --- CONTENT END ---
                5040-update-nvidia-alternative.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Use newest nvidia version by default
                    | if [ -e /etc/alternatives/nvidia ] && [ -e /usr/lib/nvidia/current ]
                    | then
                    | 	update-alternatives --quiet --set nvidia /usr/lib/nvidia/current
                    | fi
                    --- CONTENT END ---
                5010-update-apt-xapian-index.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Update the Apt Xapian index.
                    | #
                    | # The package would do this itself, but (a) it checks policy-rc.d which says it
                    | # is not allowed to, and (b) it wants to build the index in the background which
                    | # will be racy in the context of live-build.
                    | 
                    | if command -v update-apt-xapian-index >/dev/null
                    | then
                    | 	PYTHONDONTWRITEBYTECODE=1 update-apt-xapian-index --force --quiet
                    | fi
                    --- CONTENT END ---
            [DIR] live/
                0050-disable-sysvinit-tmpfs.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Disable tmpfs on /tmp
                    | 
                    | if [ -e /etc/default/rcS ]
                    | then
                    | 	sed -i -e 's|^ *RAMTMP=.*|RAMTMP=no|' /etc/default/rcS
                    | fi
                    --- CONTENT END ---
                01-setup-zram.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | echo "🧠 Configuring ZRAM Swap (Genome Spec)..."
                    | apt-get install -y zram-tools
                    | echo "ALGO=zstd" >> /etc/default/zramswap
                    | echo "PERCENT=50" >> /etc/default/zramswap
                    --- CONTENT END ---
                03-mate-theme.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | echo "🎨 Applying FurryOS Branding to MATE..."
                    | mkdir -p /usr/share/glib-2.0/schemas/
                    | cat <<EOF > /usr/share/glib-2.0/schemas/99-furryos-mate.gschema.override
                    | [org.mate.background]
                    | picture-filename='/usr/share/backgrounds/furryos/wallpaper.jpg'
                    | 
                    | [org.mate.interface]
                    | gtk-theme='Menta'
                    | icon-theme='mate'
                    | font-name='Noto Sans 10'
                    | EOF
                    | glib-compile-schemas /usr/share/glib-2.0/schemas/
                    --- CONTENT END ---
                02-setup-power.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | echo "🔋 Configuring TLP Power Management..."
                    | apt-get install -y tlp tlp-rdw
                    | systemctl enable tlp
                    --- CONTENT END ---
                0010-disable-kexec-tools.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | 
                    | set -e
                    | 
                    | # Disable kexec-tools
                    | 
                    | if [ -e /sbin/kexec ]
                    | then
                    | 	echo "kexec-tools kexec-tools/load_kexec boolean false" > /root/preseed
                    | 
                    | 	debconf-set-selections /root/preseed
                    | 
                    | 	rm -f /root/preseed
                    | 
                    | 	dpkg-reconfigure kexec-tools
                    | fi
                    --- CONTENT END ---
                04-enable-server-services.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | echo "🔌 Enabling Server & Hypervisor Services..."
                    | systemctl enable ssh
                    | systemctl enable libvirtd
                    | systemctl enable cockpit.socket
                    --- CONTENT END ---
                01-furryos-branding.hook.chroot
                    --- CONTENT START ---
                    | #!/bin/sh
                    | echo "🦊 Hook: Applying FurryOS Identity..."
                    | # Update OS Release
                    | sed -i 's/PRETTY_NAME=.*/PRETTY_NAME="FurryOS (Rolling)"/g' /etc/os-release
                    | sed -i 's/NAME="Debian GNU\/Linux"/NAME="FurryOS"/g' /etc/os-release
                    | # Set Hostname
                    | echo "furryos" > /etc/hostname
                    | echo "127.0.1.1 furryos" >> /etc/hosts
                    --- CONTENT END ---
        [DIR] preseed/
        [DIR] debian-installer/
        [DIR] includes.installer/
        [DIR] includes/
        [DIR] includes.bootstrap/
        [DIR] package-lists/
            ⚠️  WARNING: 'neofetch' found in desktop.list.chroot
            desktop.list.chroot
                --- CONTENT START ---
                | task-gnome-desktop
                | firmware-linux
                | firmware-iwlwifi
                | firmware-misc-nonfree
                | neofetch
                | htop
                | curl
                | git
                | calamares
                | calamares-settings-debian
                | plymouth
                | plymouth-themes
                --- CONTENT END ---
            genome_generated.list.chroot
                --- CONTENT START ---
                | btrfs-progs
                | build-essential
                | calamares
                | calamares-settings-debian
                | dosfstools
                | firmware-iwlwifi
                | firmware-linux
                | firmware-misc-nonfree
                | gamemode
                | gimp
                | git
                | gparted
                | libreoffice
                | linux-image-amd64
                | lutris
                | mate-utils
                | obs-studio
                | pavucontrol
                | pipewire
                | pipewire-pulse
                | plymouth
                | plymouth-themes
                | python3
                | steam-installer
                | task-mate-desktop
                | tlp
                | vlc
                | wireplumber
                | zram-tools
                | # --- FURRYOS HYPERVISOR SUITE ---
                | qemu-system-x86
                | libvirt-daemon-system
                | libvirt-clients
                | virt-manager
                | bridge-utils
                | ovmf
                | 
                | # --- SERVER & WEB DASHBOARD ---
                | openssh-server
                | tmux
                | htop
                | cockpit
                | cockpit-machines
                | cockpit-podman
                | cockpit-storaged
                | cockpit-networkmanager
                --- CONTENT END ---
            live.list.chroot
                --- CONTENT START ---
                | live-boot
                | live-config
                | live-config-systemd
                | systemd-sysv
                --- CONTENT END ---
        [DIR] packages.chroot/
        [DIR] includes.binary/
        [DIR] rootfs/
    [DIR] auto/
    [DIR] scripts/
        activate_furryos.sh
            --- CONTENT START ---
            | #!/bin/bash
            | # Convenient wrapper to activate FurryOS venv
            | 
            | SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
            | source "$SCRIPT_DIR/furryos_venv/bin/activate"
            | 
            | echo "🐾 FurryOS venv activated!"
            | echo "Python: $(which python3)"
            | echo "Pip: $(which pip3)"
            | echo ""
            | echo "To deactivate: type 'deactivate'"
            --- CONTENT END ---
        fix_pip.sh
            --- CONTENT START ---
            | #!/bin/bash
            | rm -f /usr/lib/python3*/EXTERNALLY-MANAGED
            | echo "Pip Unchained!"
            --- CONTENT END ---
        close_all_windows.sh
            --- CONTENT START ---
            | #!/bin/bash
            | wmctrl -l | awk '{print $1}' | xargs -n1 wmctrl -ic
            --- CONTENT END ---
        translate_genome.py
            --- CONTENT START ---
            | import yaml
            | import os
            | import sys
            | 
            | # --- SMART PATH FINDING ---
            | def find_project_root():
            |     """Hunts for the root directory by looking for 'build.sh' or 'config'."""
            |     current_dir = os.path.dirname(os.path.abspath(__file__))
            |     
            |     # Walk up 3 levels to find the root
            |     for _ in range(3):
            |         if os.path.exists(os.path.join(current_dir, 'build.sh')):
            |             return current_dir
            |         current_dir = os.path.dirname(current_dir)
            |     
            |     # Fallback: assume the script is in /scripts/ and root is one level up
            |     return os.path.abspath(os.path.join(os.path.dirname(__file__), '..'))
            | 
            | ROOT_DIR = find_project_root()
            | CONFIG_DIR = os.path.join(ROOT_DIR, 'config')
            | YAML_FILE = os.path.join(CONFIG_DIR, 'GENOME.yaml')
            | 
            | print(f"📍 Project Root detected: {ROOT_DIR}")
            | print(f"📍 Config File target: {YAML_FILE}")
            | 
            | # --- TEMPLATES ---
            | ZRAM_HOOK = """#!/bin/sh
            | echo "🧠 Configuring ZRAM Swap (Genome Spec)..."
            | apt-get install -y zram-tools
            | echo "ALGO=zstd" >> /etc/default/zramswap
            | echo "PERCENT=50" >> /etc/default/zramswap
            | """
            | 
            | TLP_HOOK = """#!/bin/sh
            | echo "🔋 Configuring TLP Power Management..."
            | apt-get install -y tlp tlp-rdw
            | systemctl enable tlp
            | """
            | 
            | THEME_HOOK = """#!/bin/sh
            | echo "🎨 Applying FurryOS Branding to MATE..."
            | mkdir -p /usr/share/glib-2.0/schemas/
            | cat <<EOF > /usr/share/glib-2.0/schemas/99-furryos-mate.gschema.override
            | [org.mate.background]
            | picture-filename='/usr/share/backgrounds/furryos/wallpaper.jpg'
            | 
            | [org.mate.interface]
            | gtk-theme='Menta'
            | icon-theme='mate'
            | font-name='Noto Sans 10'
            | EOF
            | glib-compile-schemas /usr/share/glib-2.0/schemas/
            | """
            | 
            | # --- MAIN LOGIC ---
            | 
            | def load_yaml():
            |     if not os.path.exists(YAML_FILE):
            |         print(f"❌ Error: Could not find GENOME.yaml at {YAML_FILE}")
            |         print("   Please ensure the file exists in the 'config' folder in your project root.")
            |         sys.exit(1)
            |     with open(YAML_FILE, 'r') as f:
            |         return yaml.safe_load(f)
            | 
            | def generate_packages(data):
            |     print("📦 Generating Package List from GENOME.yaml...")
            |     packages = set()
            | 
            |     # 1. Base Essentials
            |     packages.add("task-mate-desktop")
            |     packages.add("mate-utils")
            |     packages.add("plymouth")
            |     packages.add("plymouth-themes")
            |     packages.add("calamares")
            |     packages.add("calamares-settings-debian")
            |     
            |     # 2. Hardware/Kernel
            |     packages.add("linux-image-amd64")
            |     packages.add("firmware-linux")
            |     packages.add("firmware-iwlwifi")
            |     packages.add("firmware-misc-nonfree")
            | 
            |     # 3. Features from YAML
            |     try:
            |         # Audio
            |         if data.get('taxonomy', {}).get('family', {}).get('audio_server') == 'pipewire':
            |             packages.update(["pipewire", "pipewire-pulse", "wireplumber", "pavucontrol"])
            | 
            |         # Storage
            |         packages.add("btrfs-progs")
            |         packages.add("gparted")
            |         packages.add("dosfstools")
            | 
            |         # Performance
            |         packages.add("tlp")
            |         packages.add("zram-tools")
            | 
            |         # Bundles
            |         bundles = data.get('installer', {}).get('wizard', {}).get('step4_packages', {}).get('bundles', {})
            |         
            |         if bundles.get('gaming'):
            |             packages.update(["steam-installer", "lutris", "gamemode"])
            |         
            |         if bundles.get('development'):
            |             packages.update(["git", "python3", "build-essential"])
            | 
            |         if bundles.get('multimedia'):
            |             packages.update(["gimp", "obs-studio", "vlc"])
            | 
            |         if bundles.get('office'):
            |             packages.add("libreoffice")
            | 
            |     except KeyError as e:
            |         print(f"⚠️  Warning: Missing expected key in YAML: {e}. Skipping some packages.")
            | 
            |     # Write
            |     pkg_path = os.path.join(CONFIG_DIR, 'package-lists', 'genome_generated.list.chroot')
            |     if not os.path.exists(os.path.dirname(pkg_path)):
            |         os.makedirs(os.path.dirname(pkg_path))
            |         
            |     with open(pkg_path, 'w') as f:
            |         f.write("\n".join(sorted(packages)))
            |     print(f"   ✅ Added {len(packages)} packages to {pkg_path}")
            | 
            | def generate_hooks(data):
            |     print("🪝 Generating System Hooks...")
            |     hook_dir = os.path.join(CONFIG_DIR, 'hooks', 'live')
            |     if not os.path.exists(hook_dir):
            |         os.makedirs(hook_dir)
            | 
            |     with open(os.path.join(hook_dir, '01-setup-zram.hook.chroot'), 'w') as f: f.write(ZRAM_HOOK)
            |     with open(os.path.join(hook_dir, '02-setup-power.hook.chroot'), 'w') as f: f.write(TLP_HOOK)
            |     with open(os.path.join(hook_dir, '03-mate-theme.hook.chroot'), 'w') as f: f.write(THEME_HOOK)
            |     
            |     for filename in os.listdir(hook_dir):
            |         os.chmod(os.path.join(hook_dir, filename), 0o755)
            |     print("   ✅ Hooks generated.")
            | 
            | if __name__ == "__main__":
            |     data = load_yaml()
            |     generate_packages(data)
            |     generate_hooks(data)
            |     print("\n🚀 Translation Complete! Your ISO will now include the GENOME specs.")
            --- CONTENT END ---
        deploy_iso.py
            --- CONTENT START ---
            | #!/usr/bin/env python3
            | """
            | ===============================================================================
            | FURRYOS DEPLOYER: UNIVERSAL ACCESS EDITION
            | ===============================================================================
            | 1. BOOT: Injects pre-extracted kernel files.
            | 2. DATA: Embeds 9GB ANTHROHEART library.
            | 3. USER EXP: Injects PDF Guide & Smart Symlinking (Library/Docs).
            | 4. BUILD: Generates Hybrid ISO.
            | ===============================================================================
            | """
            | import os
            | import shutil
            | import subprocess
            | import sys
            | from pathlib import Path
            | 
            | def find_api_key():
            |     """
            |     Intelligently hunts for the API key by walking up the directory tree.
            |     Works regardless of where this script is run from.
            |     """
            |     import os, sys
            |     filename = 'Gemini_API.key.txt'
            |     current_search_dir = os.path.dirname(os.path.abspath(__file__))
            |     
            |     # Walk up the tree (max 5 levels) to find the 'assets' folder
            |     for _ in range(5):
            |         potential_key = os.path.join(current_search_dir, 'assets', filename)
            |         if os.path.exists(potential_key):
            |             return potential_key
            |         
            |         # Move up one level
            |         parent_dir = os.path.dirname(current_search_dir)
            |         if parent_dir == current_search_dir: # We hit the root
            |             break
            |         current_search_dir = parent_dir
            |     
            |     # Fallback: Check Desktop
            |     desktop_fallback = os.path.expanduser('~/Desktop/Gemini_API.key.txt')
            |     if os.path.exists(desktop_fallback):
            |         return desktop_fallback
            | 
            |     print("❌ CRITICAL ERROR: Could not find 'Gemini_API.key.txt' anywhere.")
            |     sys.exit(1)
            | 
            | 
            | # --- Configuration ---
            | VERSION = "8.2.0-origin"
            | BUILD_DIR = Path("furryos_build")
            | ISO_WORK = BUILD_DIR / "iso_workspace"
            | OUTPUT_DIR = Path("output")
            | 
            | # Paths
            | KERNEL_SRC = Path("kernel")
            | ANTHROHEART_SRC = Path("ANTHROHEART")
            | ASSETS_DIR = Path("assets")
            | DOCS_SRC = Path("FurryOS_Complete_Documentation.pdf")
            | ISO_NAME = f"furryos-{VERSION}-x86_64.iso"
            | 
            | # Note: The original script provided does not contain existing API key loading logic
            | # (e.g., definitions of API_KEY_FILE or opening 'key.txt') to replace.
            | # The 'find_api_key' function is inserted as requested.
            | 
            | def run_cmd(cmd, desc):
            |     print(f"⚡ {desc}...")
            |     try:
            |         subprocess.run(cmd, shell=True, check=True)
            |     except subprocess.CalledProcessError as e:
            |         print(f"❌ Error: {e}")
            |         sys.exit(1)
            | 
            | def setup_workspace():
            |     print("🧹 Cleaning workspace...")
            |     if ISO_WORK.exists(): shutil.rmtree(ISO_WORK)
            | 
            |     dirs = [
            |         "boot/grub",
            |         "live",
            |         "furryos/bin",
            |         "furryos/assets",
            |         "furryos/scripts",
            |         "furryos/source",
            |         "furryos/docs"      # NEW: Documentation folder
            |     ]
            |     for d in dirs:
            |         (ISO_WORK / d).mkdir(parents=True, exist_ok=True)
            | 
            | def inject_kernel_files():
            |     print("🐧 Injecting Kernel from /TOP/kernel/...")
            |     required = ["vmlinuz", "initrd.img", "filesystem.squashfs"]
            |     for filename in required:
            |         src = KERNEL_SRC / filename
            |         dst = ISO_WORK / "live" / filename
            |         if not src.exists():
            |             print(f"❌ CRITICAL: {filename} missing in /TOP/kernel/")
            |             sys.exit(1)
            |         shutil.copy2(src, dst)
            | 
            | def copy_content():
            |     # 1. AnthroHeart Library
            |     if ANTHROHEART_SRC.exists():
            |         print(f"📦 Found ANTHROHEART Library! Copying...")
            |         dest = ISO_WORK / "furryos/ANTHROHEART"
            |         subprocess.run(f"rsync -a --info=progress2 '{ANTHROHEART_SRC}/' '{dest}/'", shell=True)
            | 
            |     # 2. PDF User Guide
            |     if DOCS_SRC.exists():
            |         print(f"📘 Embedding User Guide...")
            |         shutil.copy2(DOCS_SRC, ISO_WORK / "furryos/docs/FurryOS_User_Guide.pdf")
            |     else:
            |         print("⚠️  PDF Guide not found (Run compile_docs.py first).")
            | 
            | def inject_user_experience():
            |     print("🧠 Injecting Smart User Experience (Welcome Wagon)...")
            | 
            |     # This script runs on login to wire up the Library and Docs
            |     wagon_code = r'''#!/usr/bin/env python3
            | """
            | 🐾 FurryOS Welcome Wagon
            | - Links ANTHROHEART library to Home Folder
            | - Copies User Guide to Documents
            | - Checks Persistence
            | """
            | import os
            | import shutil
            | import subprocess
            | from pathlib import Path
            | 
            | HOME = Path.home()
            | ISO_ROOT = Path("/lib/live/mount/medium") # Standard Debian Live mount point
            | # Fallback if finding mount fails (search common mounts)
            | if not (ISO_ROOT / "furryos").exists():
            |     # Try finding where the ISO is mounted
            |     for root, dirs, files in os.walk("/run/media"):
            |         if "furryos" in dirs:
            |             ISO_ROOT = Path(root)
            |             break
            |     if not (ISO_ROOT / "furryos").exists():
            |         # Last ditch: check /run/live/medium
            |         ISO_ROOT = Path("/run/live/medium")
            | 
            | LIBRARY_SRC = ISO_ROOT / "furryos/ANTHROHEART"
            | DOCS_SRC = ISO_ROOT / "furryos/docs/FurryOS_User_Guide.pdf"
            | FLAG_FILE = HOME / ".config/furryos/setup_complete"
            | 
            | def setup_environment():
            |     # 1. Symlink the Library (Read-Only Access)
            |     lib_link = HOME / "ANTHROHEART_LIBRARY"
            |     if LIBRARY_SRC.exists() and not lib_link.exists():
            |         try:
            |             os.symlink(LIBRARY_SRC, lib_link)
            |             print("   Linked ANTHROHEART Library")
            |         except Exception as e: print(f"Link Error: {e}")
            | 
            |     # 2. Copy User Guide to Documents
            |     docs_dir = HOME / "Documents"
            |     docs_dir.mkdir(exist_ok=True)
            |     target_pdf = docs_dir / "FurryOS_User_Guide.pdf"
            | 
            |     if DOCS_SRC.exists() and not target_pdf.exists():
            |         try:
            |             shutil.copy2(DOCS_SRC, target_pdf)
            |             print("   Copied User Guide")
            |         except Exception as e: print(f"Copy Error: {e}")
            | 
            |     # 3. Create Config Flag
            |     (HOME / ".config/furryos").mkdir(parents=True, exist_ok=True)
            |     with open(FLAG_FILE, "w") as f: f.write("Setup Done")
            | 
            |     # 4. Show Welcome Notification
            |     if os.environ.get("DISPLAY"):
            |         subprocess.run(["notify-send", "FurryOS Ready", "Library linked & Guide in Documents!"])
            | 
            | if __name__ == "__main__":
            |     if not FLAG_FILE.exists():
            |         setup_environment()
            | '''
            | 
            |     # Save the script
            |     script_dest = ISO_WORK / "furryos/scripts/welcome_wagon.py"
            |     with open(script_dest, "w") as f:
            |         f.write(wagon_code)
            |     os.chmod(script_dest, 0o755)
            | 
            |     # Inject other tools
            |     if (ASSETS_DIR / "omni.py").exists():
            |         shutil.copy2(ASSETS_DIR / "omni.py", ISO_WORK / "furryos/bin/omni")
            | 
            |     # Embed Source
            |     src_dest = ISO_WORK / "furryos/source"
            |     ignore = shutil.ignore_patterns("furryos_build", "output", "*.iso", "venv", "kernel")
            |     if Path("assets").exists(): shutil.copytree("assets", src_dest / "assets", ignore=ignore)
            |     for f in ["quick_start.sh", "GENOME.yaml"]:
            |         if Path(f).exists(): shutil.copy2(f, src_dest)
            | 
            | def populate_binaries():
            |     print("📦 Copying Binaries...")
            |     src_bin = BUILD_DIR / "bin"
            |     if src_bin.exists():
            |         for f in src_bin.glob("*"): shutil.copy2(f, ISO_WORK / "furryos/bin")
            | 
            |     # Etcher
            |     etcher = list(ASSETS_DIR.glob("balenaEtcher*.AppImage"))
            |     if etcher: shutil.copy2(etcher[0], ISO_WORK / "furryos/assets")
            | 
            | def create_grub_config():
            |     print("📝 Creating GRUB Config...")
            |     cfg = r"""
            | set default=0
            | set timeout=5
            | menuentry "FurryOS Live (Desktop)" {
            |     linux /live/vmlinuz boot=live components quiet splash persistence username=anthro hostname=furryos
            |     initrd /live/initrd.img
            | }
            | menuentry "FurryOS Live (Safe)" {
            |     linux /live/vmlinuz boot=live components nomodeset username=anthro
            |     initrd /live/initrd.img
            | }
            | """
            |     with open(ISO_WORK / "boot/grub/grub.cfg", "w") as f:
            |         f.write(cfg)
            | 
            | def build_iso():
            |     print(f"\n💿 Building Final ISO: {ISO_NAME}...")
            |     OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
            |     output_iso = OUTPUT_DIR / ISO_NAME
            | 
            |     cmd = f"grub-mkrescue -o '{output_iso}' {ISO_WORK} -- -volid 'FURRYOS_LIVE'"
            |     try:
            |         run_cmd(cmd, "Generating Hybrid ISO")
            |     except:
            |         print("❌ Build failed. Install grub-common/xorriso/mtools")
            |         sys.exit(1)
            | 
            |     if output_iso.exists():
            |         print(f"\n✅ SUCCESS! ISO Created: {output_iso}")
            | 
            | def main():
            |     setup_workspace()
            |     inject_kernel_files()
            |     copy_content()
            |     inject_user_experience() # <--- Wires up Library/Docs
            |     populate_binaries()
            |     create_grub_config()
            |     build_iso()
            | 
            | if __name__ == "__main__":
            |     main()
            --- CONTENT END ---
        TIMESTAMPER.py
            --- CONTENT START ---
            | import socket
            | import struct
            | import time
            | from datetime import datetime, timezone
            | 
            | def get_ntp_time_with_telemetry(server='time.google.com'):
            |     """Get time from NTP server with detailed telemetry"""
            |     NTP_PACKET_FORMAT = "!12I"
            |     NTP_DELTA = 2208988800  # 1970-01-01 00:00:00
            |     NTP_QUERY = b'\x1b' + 47 * b'\0'
            |     
            |     telemetry = {
            |         'server_hostname': server,
            |         'server_ip': None,
            |         'response_time_ms': None,
            |         'stratum': None,
            |         'precision': None,
            |         'root_delay': None,
            |         'success': False,
            |         'error': None
            |     }
            |     
            |     try:
            |         # Resolve server IP
            |         telemetry['server_ip'] = socket.gethostbyname(server)
            |         
            |         with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            |             s.settimeout(5)
            |             
            |             # Measure response time
            |             start_time = time.time()
            |             s.sendto(NTP_QUERY, (server, 123))
            |             msg, address = s.recvfrom(1024)
            |             end_time = time.time()
            |             
            |             telemetry['response_time_ms'] = (end_time - start_time) * 1000
            |         
            |         unpacked = struct.unpack(NTP_PACKET_FORMAT, msg[0:struct.calcsize(NTP_PACKET_FORMAT)])
            |         
            |         # Extract NTP packet details
            |         leap_indicator = (msg[0] >> 6) & 0x3
            |         version = (msg[0] >> 3) & 0x7
            |         mode = msg[0] & 0x7
            |         telemetry['stratum'] = msg[1]
            |         telemetry['precision'] = struct.unpack('!b', bytes([msg[2]]))[0]
            |         telemetry['root_delay'] = unpacked[1] / 2**16
            |         
            |         ntp_time = unpacked[10] + float(unpacked[11]) / 2**32
            |         epoch_time = ntp_time - NTP_DELTA
            |         
            |         telemetry['success'] = True
            |         telemetry['leap_indicator'] = leap_indicator
            |         telemetry['version'] = version
            |         telemetry['mode'] = mode
            |         
            |         return epoch_time, telemetry
            |     except Exception as e:
            |         telemetry['error'] = str(e)
            |         print(f"NTP query failed: {e}")
            |         print("Falling back to system time...")
            |         return time.time(), telemetry
            | 
            | # Get time from NTP server
            | ntp_server = 'time.google.com'
            | epoch_timestamp, telemetry = get_ntp_time_with_telemetry(ntp_server)
            | 
            | # Create datetime object in UTC
            | dt_utc = datetime.fromtimestamp(epoch_timestamp, tz=timezone.utc)
            | 
            | # Build telemetry section
            | telemetry_section = f"""=== NTP SERVER TELEMETRY ===
            | Server Hostname: {telemetry['server_hostname']}
            | Server IP Address: {telemetry['server_ip'] if telemetry['server_ip'] else 'N/A'}
            | Query Success: {'Yes' if telemetry['success'] else 'No (using system time)'}
            | Response Time: {f"{telemetry['response_time_ms']:.2f} ms" if telemetry['response_time_ms'] else 'N/A'}
            | """
            | 
            | if telemetry['success']:
            |     telemetry_section += f"""Stratum Level: {telemetry['stratum']} (distance from reference clock)
            | Precision: {telemetry['precision']} (log2 seconds)
            | Root Delay: {telemetry['root_delay']:.6f} seconds
            | NTP Version: {telemetry.get('version', 'N/A')}
            | Leap Indicator: {telemetry.get('leap_indicator', 'N/A')}
            | """
            | else:
            |     telemetry_section += f"Error Details: {telemetry['error']}\n"
            | 
            | # Prepare full timestamp data
            | timestamp_data = f"""TIMESTAMP ARCHIVE FILE
            | Generated: {datetime.now(timezone.utc).strftime('%Y-%m-%d %H:%M:%S UTC')}
            | Query Time: {datetime.now(timezone.utc).isoformat()}
            | 
            | {telemetry_section}
            | === EPOCH FORMATS ===
            | Unix Epoch (seconds): {int(epoch_timestamp)}
            | Unix Epoch (milliseconds): {int(epoch_timestamp * 1000)}
            | Unix Epoch (microseconds): {int(epoch_timestamp * 1000000)}
            | Precise Epoch: {epoch_timestamp:.6f}
            | 
            | === HUMAN READABLE FORMATS ===
            | ISO 8601 Format: {dt_utc.isoformat()}
            | RFC 2822 Format: {dt_utc.strftime('%a, %d %b %Y %H:%M:%S +0000')}
            | Standard Format: {dt_utc.strftime('%Y-%m-%d %H:%M:%S UTC')}
            | Long Format: {dt_utc.strftime('%A, %B %d, %Y at %H:%M:%S UTC')}
            | Compact Format: {dt_utc.strftime('%Y%m%d_%H%M%S')}
            | 
            | === COMPONENT BREAKDOWN ===
            | Year: {dt_utc.year}
            | Month: {dt_utc.month:02d} ({dt_utc.strftime('%B')})
            | Day: {dt_utc.day:02d} ({dt_utc.strftime('%A')})
            | Hour: {dt_utc.hour:02d}
            | Minute: {dt_utc.minute:02d}
            | Second: {dt_utc.second:02d}
            | Microsecond: {dt_utc.microsecond}
            | """
            | 
            | # Write to file
            | filename = 'TIMESTAMP.txt'
            | with open(filename, 'w', encoding='utf-8') as f:
            |     f.write(timestamp_data)
            | 
            | print(f"✓ SUCCESS: Timestamp file written!")
            | print(f"✓ File: {filename}")
            | print(f"✓ Location: Current working directory")
            | print(f"\n--- Telemetry Summary ---")
            | print(f"Server: {telemetry['server_hostname']} ({telemetry['server_ip']})")
            | print(f"Response Time: {telemetry['response_time_ms']:.2f} ms" if telemetry['response_time_ms'] else "N/A")
            | print(f"Stratum: {telemetry['stratum']}" if telemetry.get('stratum') else "N/A")
            --- CONTENT END ---
        omni.py
            --- CONTENT START ---
            | #!/usr/bin/env python3
            | import os, sys, subprocess
            | # [Omni Tool Stub - Full code in patches]
            | print("Omni 1.0 Active")
            --- CONTENT END ---
        upgrade_project.py
            --- CONTENT START ---
            --- CONTENT END ---
        setup_venv.sh
            --- CONTENT START ---
            | #!/bin/bash
            | # setup_venv.sh - Creates isolated Python environment for FurryOS build system
            | # Location: /TOP/setup_venv.sh
            | # This venv can be distributed WITH the ISO for offline builds
            | 
            | set -e
            | 
            | VENV_DIR="furryos_venv"
            | PYTHON_VERSION=$(python3 --version | cut -d' ' -f2 | cut -d'.' -f1,2)
            | 
            | echo "========================================"
            | echo "   🐾 FurryOS venv Setup 🐾"
            | echo "========================================"
            | echo ""
            | 
            | # Check if already exists
            | if [ -d "$VENV_DIR" ]; then
            |     echo "⚠️  venv already exists at $VENV_DIR"
            |     read -p "Remove and recreate? [y/N]: " -n 1 -r
            |     echo
            |     if [[ $REPLY =~ ^[Yy]$ ]]; then
            |         echo "🗑️  Removing old venv..."
            |         rm -rf "$VENV_DIR"
            |     else
            |         echo "✓ Using existing venv"
            |         exit 0
            |     fi
            | fi
            | 
            | # Create venv
            | echo "📦 Creating Python $PYTHON_VERSION virtual environment..."
            | python3 -m venv "$VENV_DIR"
            | 
            | # Activate venv
            | source "$VENV_DIR/bin/activate"
            | 
            | # Upgrade pip
            | echo "⬆️  Upgrading pip..."
            | pip install --upgrade pip setuptools wheel
            | 
            | # Install ALL FurryOS dependencies
            | echo "📥 Installing FurryOS dependencies..."
            | echo "   This may take 2-5 minutes..."
            | 
            | # Core dependencies
            | pip install pyyaml
            | pip install requests
            | pip install pillow
            | pip install mutagen
            | pip install cryptography
            | pip install jinja2
            | 
            | # Build tools
            | pip install pipreqs
            | 
            | # Optional but useful
            | pip install python-magic-bin 2>/dev/null || pip install python-magic 2>/dev/null || true
            | 
            | echo ""
            | echo "✅ All packages installed successfully!"
            | 
            | # Create activation wrapper
            | cat > activate_furryos.sh << 'WRAPPER'
            | #!/bin/bash
            | # Convenient wrapper to activate FurryOS venv
            | 
            | SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
            | source "$SCRIPT_DIR/furryos_venv/bin/activate"
            | 
            | echo "🐾 FurryOS venv activated!"
            | echo "Python: $(which python3)"
            | echo "Pip: $(which pip3)"
            | echo ""
            | echo "To deactivate: type 'deactivate'"
            | WRAPPER
            | 
            | chmod +x activate_furryos.sh
            | 
            | # Create requirements.txt
            | pip freeze > "$VENV_DIR/requirements.txt"
            | 
            | # Also create a clean requirements.txt in /TOP
            | cat > requirements.txt << 'REQS'
            | # FurryOS Build System Requirements
            | # Install with: pip install -r requirements.txt
            | 
            | # Core Framework
            | pyyaml>=6.0
            | requests>=2.31.0
            | 
            | # Media Processing
            | pillow>=10.0.0
            | mutagen>=1.47.0
            | 
            | # Cryptography & Signing
            | cryptography>=41.0.0
            | 
            | # Template Engine
            | jinja2>=3.1.2
            | 
            | # Build Tools
            | pipreqs>=0.5.0
            | 
            | # Additional Requirements
            | sudo apt-get install genisoimage xorriso grub-pc-bin grub-efi-amd64-bin
            | 
            | # Optional Dependencies
            | python-magic-bin>=0.4.14; platform_system == "Windows"
            | python-magic>=0.4.27; platform_system != "Windows"
            | REQS
            | 
            | echo "✓ requirements.txt created in /TOP and venv/"
            | 
            | # Deactivate
            | deactivate
            | 
            | echo ""
            | echo "✅ FurryOS venv created successfully!"
            | echo ""
            | echo "📍 Location: $(pwd)/$VENV_DIR"
            | echo "📦 Packages installed:"
            | cat "$VENV_DIR/requirements.txt" | wc -l
            | echo ""
            | echo "🔐 cryptography package: INSTALLED"
            | echo "   (Ed25519 signing ready)"
            | echo ""
            | echo "🚀 Usage:"
            | echo "   Method 1 (recommended):"
            | echo "      source activate_furryos.sh"
            | echo ""
            | echo "   Method 2 (manual):"
            | echo "      source $VENV_DIR/bin/activate"
            | echo ""
            | echo "   Method 3 (scripts do it automatically):"
            | echo "      Just run: ./quick_start.sh"
            | echo "      (scripts detect and use venv if available)"
            | echo ""
            | echo "📦 To bundle with ISO:"
            | echo "   tar -czf furryos_venv.tar.gz furryos_venv/"
            | echo ""
            | echo "🐾 Go touch grass; venv setup complete! 🌱"
            --- CONTENT END ---
        enhance_configs.py
            --- CONTENT START ---
            | import os
            | import time
            | import google.generativeai as genai
            | import sys
            | 
            | # --- SMART CONFIGURATION ---
            | 
            | def find_api_key():
            |     """
            |     Intelligently hunts for the API key by walking up the directory tree.
            |     Works regardless of where this script is run from.
            |     """
            |     filename = 'Gemini_API.key.txt'
            |     
            |     # Start where the script lives
            |     current_search_dir = os.path.dirname(os.path.abspath(__file__))
            |     
            |     # Walk up the tree (max 5 levels) to find the 'assets' folder
            |     for _ in range(5):
            |         potential_key = os.path.join(current_search_dir, 'assets', filename)
            |         
            |         if os.path.exists(potential_key):
            |             print(f"🔑 Found API Key at: {potential_key}")
            |             return potential_key
            |         
            |         # Move up one level
            |         parent_dir = os.path.dirname(current_search_dir)
            |         if parent_dir == current_search_dir: # We hit the root of the drive
            |             break
            |         current_search_dir = parent_dir
            |         
            |     # Emergency fallback: Check the specific Desktop path seen in your screenshots
            |     desktop_fallback = '/home/anthro/Desktop/Gemini_API.key.txt'
            |     if os.path.exists(desktop_fallback):
            |         print(f"🔑 Found API Key on Desktop: {desktop_fallback}")
            |         return desktop_fallback
            | 
            |     print("❌ CRITICAL ERROR: Could not find 'Gemini_API.key.txt' anywhere.")
            |     print("   Please ensure it is in an 'assets' folder inside your project.")
            |     sys.exit(1)
            | 
            | def find_config_dir():
            |     """Finds the config directory relative to the found key or script."""
            |     script_dir = os.path.dirname(os.path.abspath(__file__))
            |     
            |     # Try generic relative path first
            |     relative_config = os.path.abspath(os.path.join(script_dir, '..', 'config'))
            |     if os.path.exists(relative_config):
            |         return relative_config
            |         
            |     # If that fails, look in current dir
            |     if os.path.exists('config'):
            |         return os.path.abspath('config')
            |         
            |     print("⚠️  Warning: Could not locate 'config' folder.")
            |     return None
            | 
            | # --- AI LOGIC ---
            | 
            | SYSTEM_PROMPT = """
            | You are a Senior Linux Distro Engineer.
            | Your Goal: READ the user's config file (YAML/JSON) and SUPERCHARGE it.
            | 1. ANALYZE: Look for missing modern features (Btrfs, ZRAM, Wayland, Theming).
            | 2. EXPAND: Add these features directly to the file.
            | 3. OUTPUT: Return ONLY the valid, upgraded file content.
            | """
            | 
            | def get_api_key_content(path):
            |     try:
            |         with open(path, 'r') as f:
            |             return f.read().strip()
            |     except Exception as e:
            |         print(f"❌ Error reading key file: {e}")
            |         sys.exit(1)
            | 
            | def enhance_file(model, file_path):
            |     print(f"🧠 Enhancing: {os.path.basename(file_path)}...")
            |     with open(file_path, 'r', encoding='utf-8') as f:
            |         content = f.read()
            |     
            |     # Skip empty or tiny files
            |     if len(content) < 10: 
            |         return
            | 
            |     try:
            |         response = model.generate_content(f"{SYSTEM_PROMPT}\n\nFILE CONTENT:\n{content}")
            |         if response.text:
            |             new_content = response.text.replace("```yaml", "").replace("```json", "").replace("```", "").strip()
            |             
            |             # Save backup
            |             os.rename(file_path, file_path + ".original")
            |             
            |             # Write new
            |             with open(file_path, 'w', encoding='utf-8') as f:
            |                 f.write(new_content)
            |             print(f"   ✅ Upgraded!")
            |     except Exception as e:
            |         print(f"   ⚠️ Failed: {e}")
            | 
            | if __name__ == "__main__":
            |     # 1. Locate Resources
            |     key_path = find_api_key()
            |     config_dir = find_config_dir()
            |     
            |     if not config_dir:
            |         print("Nothing to enhance (Config folder missing). Exiting.")
            |         sys.exit(0)
            | 
            |     # 2. Setup AI
            |     key = get_api_key_content(key_path)
            |     genai.configure(api_key=key)
            |     
            |     # Try to find the best model (fallback logic included)
            |     model = genai.GenerativeModel('gemini-2.5-flash')
            |     
            |     # 3. Execute
            |     print(f"📂 Scanning for configs in: {config_dir}")
            |     count = 0
            |     for root, dirs, files in os.walk(config_dir):
            |         for file in files:
            |             if file.endswith(('.yaml', '.yml', '.json', '.conf')):
            |                 enhance_file(model, os.path.join(root, file))
            |                 count += 1
            |                 time.sleep(1) # Polite rate limiting
            |     
            |     if count == 0:
            |         print("   No .yaml/.json/.conf files found to enhance.")
            |     else:
            |         print(f"\n✨ Processed {count} configuration files.")
            --- CONTENT END ---
        ANCHOR-TO-BITCOIN.py
            --- CONTENT START ---
            | #!/usr/bin/env python3
            | """
            | ===============================================================================
            |  AnthroHeart VPS Anchor (Bitcoin Strategy)
            | ===============================================================================
            | Purpose:
            |   Links your VPS file to the Bitcoin Blockchain for FREE.
            |   - Verifies the file hash.
            |   - Signs it with your Christmas Identity Key.
            |   - Creates a "Release Manifest" pointing to your IP.
            |   - Stamps the Manifest with OpenTimestamps (Bitcoin).
            | ===============================================================================
            | """
            | 
            | import os
            | import sys
            | import json
            | import hashlib
            | import subprocess
            | import platform
            | from datetime import datetime, timezone
            | 
            | # ---------------------------------------------------------------------------
            | # 1. SETUP
            | # ---------------------------------------------------------------------------
            | VENV_DIR = ".venv"
            | REQUIRED = ["pynacl", "opentimestamps-client"]
            | 
            | def bootstrap():
            |     if platform.system() == "Windows":
            |         py = os.path.join(VENV_DIR, "Scripts", "python.exe")
            |     else:
            |         py = os.path.join(VENV_DIR, "bin", "python")
            | 
            |     if not os.path.exists(VENV_DIR):
            |         subprocess.check_call([sys.executable, "-m", "venv", VENV_DIR])
            | 
            |     if sys.prefix == sys.base_prefix:
            |         subprocess.check_call([py, "-m", "pip", "install", *REQUIRED, "-q"])
            |         subprocess.check_call([py] + sys.argv)
            |         sys.exit(0)
            | 
            | bootstrap()
            | 
            | from nacl.signing import SigningKey
            | from nacl.encoding import HexEncoder
            | 
            | # --- CONFIGURATION ---
            | FILE_PATH = "The_AnthroHeart_Collection_Bundle.7z"
            | IDENTITY_KEY = "anthroheart_chain/anthro_identity.key"
            | PUBLIC_KEY = "anthroheart_chain/anthro_public.key"
            | VPS_URL = os.getenv(
            |     "ANTHROHEART_ORIGIN_URL",
            |     "https://torrent.anthroentertainment.com/The_AnthroHeart_Collection_Bundle.7z"
            | )
            | 
            | OUT_DIR = "anthroheart_chain"
            | 
            | def main():
            |     # A. CHECKS
            |     if not os.path.exists(FILE_PATH):
            |         sys.exit(f"❌ Error: {FILE_PATH} not found.")
            |     if not os.path.exists(IDENTITY_KEY):
            |         sys.exit(f"❌ Error: {IDENTITY_KEY} not found.")
            | 
            |     # B. LOAD IDENTITY
            |     print(f"🔑 Loading Identity...")
            |     with open(IDENTITY_KEY, "r") as f:
            |         signing_key = SigningKey(f.read().strip(), encoder=HexEncoder)
            |         public_key_hex = signing_key.verify_key.encode(encoder=HexEncoder).decode()
            | 
            |     # C. HASH FILE
            |     print(f"⚙️  Hashing 5GB file (Verification)...")
            |     sha256, sha512 = hashlib.sha256(), hashlib.sha512()
            |     
            |     with open(FILE_PATH, "rb") as f:
            |         while chunk := f.read(16 * 1024 * 1024):
            |             sha256.update(chunk)
            |             sha512.update(chunk)
            |             print(".", end="", flush=True)
            |     
            |     h256, h512 = sha256.hexdigest(), sha512.hexdigest()
            |     print(f"\n✅ Hash Verified: {h256[:16]}...")
            | 
            |     # D. SIGNATURE
            |     # Sign the Hash + URL to lock them together
            |     payload = f"{h512}|{VPS_URL}".encode()
            |     signature = signing_key.sign(payload).signature.hex()
            | 
            |     # E. CREATE RELEASE MANIFEST
            |     manifest = {
            |         "record_type": "Public_Release_Anchor",
            |         "timestamp": datetime.now(timezone.utc).isoformat(),
            |         "hosting": {
            |             "url": VPS_URL,
            |             "method": "Self-Hosted VPS (Nginx)",
            |             "note": "Primary Origin"
            |         },
            |         "integrity": {
            |             "filename": os.path.basename(FILE_PATH),
            |             "filesize": os.path.getsize(FILE_PATH),
            |             "sha256": h256,
            |             "sha512": h512
            |         },
            |         "authorization": {
            |             "signer_public_key": public_key_hex,
            |             "signature": signature,
            |             "signed_string": "sha512|url"
            |         }
            |     }
            | 
            |     manifest_filename = f"release_proof_{datetime.now().strftime('%Y%m%d')}.json"
            |     manifest_path = os.path.join(OUT_DIR, manifest_filename)
            | 
            |     with open(manifest_path, "w") as f:
            |         json.dump(manifest, f, indent=2, sort_keys=True)
            | 
            |     # F. STAMP TO BITCOIN BLOCKCHAIN
            |     print(f"\n⏳ Anchoring to Bitcoin via OpenTimestamps...")
            |     
            |     if platform.system() == "Windows":
            |         ots_exec = os.path.join(sys.prefix, "Scripts", "ots.exe")
            |     else:
            |         ots_exec = os.path.join(sys.prefix, "bin", "ots")
            |         if not os.path.exists(ots_exec): ots_exec = "ots"
            | 
            |     try:
            |         subprocess.check_call([ots_exec, "stamp", manifest_path])
            |         print("\n" + "="*60)
            |         print(" ✅ SUCCESS: RELEASE ANCHORED TO BITCOIN")
            |         print("="*60)
            |         print(f" 1. Manifest created: {manifest_path}")
            |         print(f" 2. Bitcoin Proof:    {manifest_path}.ots")
            |         print("-" * 60)
            |         print(" HOW THIS WORKS:")
            |         print(" You now have a file on your VPS, and a cryptographic proof")
            |         print(" on your computer that links that SPECIFIC file URL to the")
            |         print(" Bitcoin blockchain forever.")
            |         print("-" * 60)
            |         print(" NEXT STEP: Upload these two small files (.json and .ots)")
            |         print(" to your VPS folder next to the 7z file so geeks can verify it.")
            |         print("="*60)
            |     except Exception as e:
            |         print(f"❌ OTS Error: {e}")
            | 
            | if __name__ == "__main__":
            |     main()
            --- CONTENT END ---
        furryos-migrate.sh
            --- CONTENT START ---
            | #!/bin/bash
            | ################################################################################
            | # FurryOS Persistent Data Backup & Restore Script
            | ################################################################################
            | # This script helps transfer your persistent USB data to a full installation
            | ################################################################################
            | 
            | set -e
            | 
            | VERSION="8.0.0-origin"
            | SCRIPT_NAME="furryos-migrate.sh"
            | 
            | # Colors
            | RED='\033[0;31m'
            | GREEN='\033[0;32m'
            | YELLOW='\033[1;33m'
            | BLUE='\033[0;34m'
            | NC='\033[0m' # No Color
            | 
            | banner() {
            |     echo ""
            |     echo "================================================================================"
            |     echo "   🐾 FURRYOS PERSISTENT DATA MIGRATION TOOL 🐾"
            |     echo "   Transfer your settings from USB to full install"
            |     echo "   Version: $VERSION"
            |     echo "================================================================================"
            |     echo ""
            | }
            | 
            | check_root() {
            |     if [ "$EUID" -ne 0 ]; then 
            |         echo "${RED}❌ Please run as root (use sudo)${NC}"
            |         exit 1
            |     fi
            | }
            | 
            | mode_select() {
            |     echo "${BLUE}Select mode:${NC}"
            |     echo "  1) Backup persistent USB data"
            |     echo "  2) Restore to full installation"
            |     echo "  3) Auto-migrate (backup + restore in one step)"
            |     echo ""
            |     read -p "Enter choice [1-3]: " mode
            |     echo ""
            | }
            | 
            | backup_persistent() {
            |     echo "${GREEN}[BACKUP MODE]${NC}"
            |     echo ""
            | 
            |     # Detect persistence partition
            |     PERSIST_PART=$(findmnt -n -o SOURCE /lib/live/mount/persistence 2>/dev/null ||                    findmnt -n -o SOURCE / 2>/dev/null | sed 's/[0-9]*$/3/')
            | 
            |     if [ -z "$PERSIST_PART" ]; then
            |         echo "${YELLOW}⚠️  Cannot auto-detect persistence partition${NC}"
            |         read -p "Enter persistence partition (e.g., /dev/sdb3): " PERSIST_PART
            |     fi
            | 
            |     echo "📀 Persistence partition: $PERSIST_PART"
            |     echo ""
            | 
            |     # Mount if needed
            |     if ! mountpoint -q /mnt/persistence 2>/dev/null; then
            |         echo "📂 Mounting persistence partition..."
            |         mkdir -p /mnt/persistence
            |         mount $PERSIST_PART /mnt/persistence
            |         MOUNTED=1
            |     fi
            | 
            |     # Backup location
            |     BACKUP_DIR="/tmp/furryos_backup_$(date +%Y%m%d_%H%M%S)"
            |     mkdir -p "$BACKUP_DIR"
            | 
            |     echo "${GREEN}🔄 Backing up data to: $BACKUP_DIR${NC}"
            |     echo ""
            | 
            |     # Backup home directory
            |     if [ -d "/mnt/persistence/home" ]; then
            |         echo "  📁 Backing up /home..."
            |         rsync -ah --info=progress2 /mnt/persistence/home/ "$BACKUP_DIR/home/"
            |     else
            |         echo "  📁 Backing up /home..."
            |         rsync -ah --info=progress2 /home/ "$BACKUP_DIR/home/"
            |     fi
            | 
            |     # Backup important configs
            |     echo "  ⚙️  Backing up configs..."
            |     mkdir -p "$BACKUP_DIR/etc"
            | 
            |     # Safe configs to backup
            |     for config in hostname hosts network/interfaces NetworkManager ssh; do
            |         if [ -e "/etc/$config" ]; then
            |             cp -a "/etc/$config" "$BACKUP_DIR/etc/" 2>/dev/null || true
            |         fi
            |     done
            | 
            |     # Backup installed packages list
            |     echo "  📦 Backing up package list..."
            |     dpkg --get-selections > "$BACKUP_DIR/packages.list"
            | 
            |     # Backup ANTHROHEART playlists and favorites
            |     if [ -d "/home" ]; then
            |         echo "  🎨 Backing up ANTHROHEART user data..."
            |         find /home -type f \( -name "*.m3u" -o -name "*.pls" \) -exec cp --parents {} "$BACKUP_DIR/" \; 2>/dev/null || true
            |     fi
            | 
            |     # Create tarball
            |     echo ""
            |     echo "  📦 Creating backup archive..."
            |     TARBALL="/tmp/furryos-backup-$(date +%Y%m%d_%H%M%S).tar.gz"
            |     tar -czf "$TARBALL" -C "$BACKUP_DIR" .
            | 
            |     # Unmount if we mounted it
            |     if [ "$MOUNTED" = "1" ]; then
            |         umount /mnt/persistence
            |     fi
            | 
            |     echo ""
            |     echo "${GREEN}✅ Backup complete!${NC}"
            |     echo ""
            |     echo "📦 Backup archive: $TARBALL"
            |     echo "📊 Size: $(du -h $TARBALL | cut -f1)"
            |     echo ""
            |     echo "💾 Copy this file to:"
            |     echo "   • External drive"
            |     echo "   • Cloud storage"
            |     echo "   • Network location"
            |     echo ""
            |     echo "🔄 Then boot your full install and run:"
            |     echo "   sudo $SCRIPT_NAME"
            |     echo "   Choose option 2 (Restore)"
            |     echo ""
            | }
            | 
            | restore_to_install() {
            |     echo "${GREEN}[RESTORE MODE]${NC}"
            |     echo ""
            | 
            |     # Find backup
            |     echo "🔍 Looking for backup archives..."
            |     BACKUPS=$(find /tmp /media /mnt -name "furryos-backup-*.tar.gz" 2>/dev/null || true)
            | 
            |     if [ -z "$BACKUPS" ]; then
            |         echo "${YELLOW}⚠️  No backup archives found${NC}"
            |         read -p "Enter path to backup archive: " TARBALL
            |     else
            |         echo "Found backups:"
            |         select TARBALL in $BACKUPS "Enter path manually"; do
            |             if [ "$TARBALL" = "Enter path manually" ]; then
            |                 read -p "Enter path to backup archive: " TARBALL
            |             fi
            |             break
            |         done
            |     fi
            | 
            |     if [ ! -f "$TARBALL" ]; then
            |         echo "${RED}❌ Backup file not found: $TARBALL${NC}"
            |         exit 1
            |     fi
            | 
            |     echo "📦 Using backup: $TARBALL"
            |     echo ""
            | 
            |     # Extract to temp
            |     RESTORE_DIR="/tmp/furryos_restore_$(date +%Y%m%d_%H%M%S)"
            |     mkdir -p "$RESTORE_DIR"
            | 
            |     echo "📂 Extracting backup..."
            |     tar -xzf "$TARBALL" -C "$RESTORE_DIR"
            | 
            |     # Restore home directory
            |     if [ -d "$RESTORE_DIR/home" ]; then
            |         echo ""
            |         echo "${GREEN}🏠 Restoring home directory...${NC}"
            |         rsync -ah --info=progress2 "$RESTORE_DIR/home/" /home/
            |         echo "  ✓ Home directory restored"
            |     fi
            | 
            |     # Restore configs (carefully)
            |     if [ -d "$RESTORE_DIR/etc" ]; then
            |         echo ""
            |         echo "${GREEN}⚙️  Restoring configs...${NC}"
            |         echo "${YELLOW}⚠️  Review these changes carefully!${NC}"
            | 
            |         for config in "$RESTORE_DIR/etc"/*; do
            |             if [ -e "$config" ]; then
            |                 basename=$(basename "$config")
            |                 echo "  📝 Restore /etc/$basename? [y/N]"
            |                 read -n 1 -r
            |                 echo
            |                 if [[ $REPLY =~ ^[Yy]$ ]]; then
            |                     cp -a "$config" "/etc/"
            |                     echo "    ✓ Restored"
            |                 else
            |                     echo "    ⏭️  Skipped"
            |                 fi
            |             fi
            |         done
            |     fi
            | 
            |     # Restore packages
            |     if [ -f "$RESTORE_DIR/packages.list" ]; then
            |         echo ""
            |         echo "${GREEN}📦 Restore installed packages? [y/N]${NC}"
            |         read -n 1 -r
            |         echo
            |         if [[ $REPLY =~ ^[Yy]$ ]]; then
            |             echo "  📥 Installing packages (this may take a while)..."
            |             dpkg --set-selections < "$RESTORE_DIR/packages.list"
            |             apt-get dselect-upgrade -y
            |             echo "  ✓ Packages restored"
            |         else
            |             echo "  ⏭️  Skipped package restoration"
            |             echo "  💡 Packages list saved to: $RESTORE_DIR/packages.list"
            |         fi
            |     fi
            | 
            |     # Fix permissions
            |     echo ""
            |     echo "🔧 Fixing permissions..."
            |     for homedir in /home/*; do
            |         if [ -d "$homedir" ]; then
            |             username=$(basename "$homedir")
            |             chown -R "$username:$username" "$homedir" 2>/dev/null || true
            |         fi
            |     done
            | 
            |     # Cleanup
            |     rm -rf "$RESTORE_DIR"
            | 
            |     echo ""
            |     echo "${GREEN}✅ Restore complete!${NC}"
            |     echo ""
            |     echo "🎉 Your persistent data has been restored!"
            |     echo "🔄 Reboot to apply all changes"
            |     echo ""
            | }
            | 
            | auto_migrate() {
            |     echo "${GREEN}[AUTO-MIGRATE MODE]${NC}"
            |     echo ""
            |     echo "This will backup persistent data and restore to current system"
            |     echo "${YELLOW}⚠️  Make sure you're running on your FULL INSTALL${NC}"
            |     echo ""
            |     read -p "Continue? [y/N]: " -n 1 -r
            |     echo
            |     if [[ ! $REPLY =~ ^[Yy]$ ]]; then
            |         exit 0
            |     fi
            | 
            |     # Backup
            |     backup_persistent
            | 
            |     # Get the tarball that was just created
            |     LATEST_BACKUP=$(ls -t /tmp/furryos-backup-*.tar.gz 2>/dev/null | head -1)
            | 
            |     if [ -z "$LATEST_BACKUP" ]; then
            |         echo "${RED}❌ Backup failed${NC}"
            |         exit 1
            |     fi
            | 
            |     echo ""
            |     echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
            |     echo ""
            | 
            |     # Restore
            |     TARBALL="$LATEST_BACKUP"
            |     restore_to_install
            | }
            | 
            | main() {
            |     banner
            |     check_root
            |     mode_select
            | 
            |     case $mode in
            |         1)
            |             backup_persistent
            |             ;;
            |         2)
            |             restore_to_install
            |             ;;
            |         3)
            |             auto_migrate
            |             ;;
            |         *)
            |             echo "${RED}❌ Invalid choice${NC}"
            |             exit 1
            |             ;;
            |     esac
            | 
            |     echo ""
            |     echo "🐾 FurryOS Migration Tool - Done! 🌱"
            |     echo ""
            | }
            | 
            | main
            --- CONTENT END ---
        launcher.py
            --- CONTENT START ---
            | #!/usr/bin/env python3
            | """
            | ===============================================================================
            | FURRYOS MASTER LAUNCHER v8.2 "Madhatter"
            | ===============================================================================
            | """
            | import os
            | import sys
            | import subprocess
            | import shutil
            | from pathlib import Path
            | 
            | def find_api_key():
            |     """
            |     Intelligently hunts for the API key by walking up the directory tree.
            |     Works regardless of where this script is run from.
            |     """
            |     import os, sys
            |     filename = 'Gemini_API.key.txt'
            |     current_search_dir = os.path.dirname(os.path.abspath(__file__))
            |     
            |     # Walk up the tree (max 5 levels) to find the 'assets' folder
            |     for _ in range(5):
            |         potential_key = os.path.join(current_search_dir, 'assets', filename)
            |         if os.path.exists(potential_key):
            |             return potential_key
            |         
            |         # Move up one level
            |         parent_dir = os.path.dirname(current_search_dir)
            |         if parent_dir == current_search_dir: # We hit the root
            |             break
            |         current_search_dir = parent_dir
            |     
            |     # Fallback: Check Desktop
            |     desktop_fallback = os.path.expanduser('~/Desktop/Gemini_API.key.txt')
            |     if os.path.exists(desktop_fallback):
            |         return desktop_fallback
            | 
            |     print("❌ CRITICAL ERROR: Could not find 'Gemini_API.key.txt' anywhere.")
            |     sys.exit(1)
            | 
            | 
            | BUILD_DIR = Path("furryos_build")
            | BIN_DIR = BUILD_DIR / "bin"
            | ASSETS_DIR = Path("assets")
            | KEY_PATH = "signing_keys/furryos_signing.key"
            | 
            | # Replacing fragile API key loading with the robust function
            | API_KEY_FILE = find_api_key()
            | 
            | def run_cmd(cmd, desc):
            |     print(f"⚡ {desc}...")
            |     subprocess.run(cmd, shell=True, check=True)
            | 
            | def compile_heartbeat():
            |     print("\n[ Module: Heartbeat Core (C + ASM) ]")
            |     c_src = ASSETS_DIR / "heartbeat_core.c"
            |     asm_src = ASSETS_DIR / "heartbeat_core_asm.s"
            |     output_bin = BIN_DIR / "heartbeat_core"
            |     
            |     if shutil.which("nasm") and asm_src.exists():
            |         obj_asm = BUILD_DIR / "heartbeat_core_asm.o"
            |         run_cmd(f"nasm -f elf64 {asm_src} -o {obj_asm}", "Assembling x86_64 Core")
            |         # -no-pie is crucial for raw ASM integration
            |         run_cmd(f"gcc -O3 -march=native -pthread -no-pie {c_src} {obj_asm} -o {output_bin}", "Linking Core")
            |     else:
            |         print("⚠️  NASM missing or asm source missing.")
            | 
            | def compile_healer():
            |     print("\n[ Module: The Healer ]")
            |     healer_src = ASSETS_DIR / "healer_core.cpp"
            |     if not healer_src.exists():
            |         # Embed minimal healer just in case
            |         with open(healer_src, "w") as f:
            |             f.write(r'''#include <iostream>
            | #include <unistd.h>
            | #include <sys/wait.h>
            | #include <thread>
            | #include <chrono>
            | int main(int argc, char* argv[]) {
            |     if(argc<2)return 1;
            |     while(1){
            |         if(fork()==0) { execvp(argv[1],&argv[1]); exit(1); }
            |         int s; wait(&s); 
            |         std::this_thread::sleep_for(std::chrono::seconds(1));
            |     }
            | }''')
            |     run_cmd(f"g++ -O3 {healer_src} -o {BIN_DIR}/healer", "Compiling Healer")
            | 
            | def sign_binaries():
            |     print("\n[ Security: Signing Binaries ]")
            |     if not os.path.exists(KEY_PATH): return
            |     try:
            |         from cryptography.hazmat.primitives.asymmetric import ed25519
            |         from cryptography.hazmat.primitives import serialization
            |         with open(KEY_PATH, 'rb') as f:
            |             private_key = serialization.load_pem_private_key(f.read(), password=None)
            |         for binary in BIN_DIR.glob('*'):
            |             if binary.is_file() and not binary.suffix == '.sig':
            |                 with open(binary, 'rb') as f: data = f.read()
            |                 with open(f"{binary}.sig", 'wb') as f: f.write(private_key.sign(data))
            |                 print(f"   🔐 Signed: {binary.name}")
            |     except: pass
            | 
            | def main():
            |     if os.geteuid() != 0: sys.exit("❌ Run as root")
            |     for d in [BUILD_DIR, BIN_DIR]: d.mkdir(parents=True, exist_ok=True)
            |     compile_heartbeat()
            |     compile_healer()
            |     sign_binaries()
            |     print("\n✨ Binaries Ready.")
            | 
            | if __name__ == "__main__":
            |     main()
            --- CONTENT END ---
        compile_docs.py
            --- CONTENT START ---
            | import os
            | import datetime
            | import markdown
            | from xhtml2pdf import pisa
            | from pathlib import Path
            | 
            | def find_api_key():
            |     """
            |     Intelligently hunts for the API key by walking up the directory tree.
            |     Works regardless of where this script is run from.
            |     """
            |     import os, sys
            |     filename = 'Gemini_API.key.txt'
            |     current_search_dir = os.path.dirname(os.path.abspath(__file__))
            |     
            |     # Walk up the tree (max 5 levels) to find the 'assets' folder
            |     for _ in range(5):
            |         potential_key = os.path.join(current_search_dir, 'assets', filename)
            |         if os.path.exists(potential_key):
            |             return potential_key
            |         
            |         # Move up one level
            |         parent_dir = os.path.dirname(current_search_dir)
            |         if parent_dir == current_search_dir: # We hit the root
            |             break
            |         current_search_dir = parent_dir
            |     
            |     # Fallback: Check Desktop
            |     desktop_fallback = os.path.expanduser('~/Desktop/Gemini_API.key.txt')
            |     if os.path.exists(desktop_fallback):
            |         return desktop_fallback
            | 
            |     print("❌ CRITICAL ERROR: Could not find 'Gemini_API.key.txt' anywhere.")
            |     sys.exit(1)
            | 
            | 
            | # ==============================================================================
            | # CONFIGURATION
            | # ==============================================================================
            | OUTPUT_FILENAME = "FurryOS_Complete_Documentation.pdf"
            | VERSION = "8.0.0-origin"
            | TIMESTAMP = datetime.datetime.now().strftime("%Y-%m-%d %H:%M")
            | BRANDING = "Anthro Entertainment LLC"
            | 
            | SOURCE_DIRS = [".", "guides"]
            | EXTENSIONS = [".md", ".txt", ".yaml", ".json"]
            | 
            | EXCLUDE_FILES = [
            |     "requirements.txt", "MANIFEST.txt", "Gemini_API.key.txt",
            |     "compile_docs.py", "patch_furryos_optimized.py", "LICENSE",
            |     "create_partitions.py", "deploy_iso.py", "launcher.py"
            | ]
            | 
            | # ==============================================================================
            | # LOGICAL BOOK STRUCTURE
            | # ==============================================================================
            | BOOK_STRUCTURE = {
            |     "1. Overview": [
            |         "README.md",
            |         "ISO_README.txt",
            |         "VERSION_REFERENCE.md",
            |     ],
            |     "2. Configuration": [
            |         "GENOME.yaml",
            |         "USER_CONFIG.yaml",
            |     ],
            |     "3. Build System": [
            |         "FRESH_BUILD_GUIDE.md",
            |         "BUILD_OPTIONS.md",
            |         "BUILD_SUMMARY.md",
            |         "PROGRESS_FEATURES.md",
            |         "VENV_GUIDE.md",
            |     ],
            |     "4. Features & Usage": [
            |         "ANTHROHEART_INCLUSION_GUIDE.md",
            |         "PERSISTENCE_GUIDE.md",
            |         "SMART_PARTITION_GUIDE.md",
            |         "ETCHER_INCLUSION_GUIDE.md",
            |         "SIGNING_GUIDE.md",
            |     ],
            |     "5. Technical Reference": [
            |         "C_ASSEMBLY_OPTIMIZATION.md",
            |         "ASSEMBLY_OPTIMIZATION_PLAN.md",
            |         "FILE_ORGANIZATION.md",
            |         "PACKAGE_LIST.md",
            |     ],
            |     "6. Troubleshooting": [
            |         "UPDATE_INSTRUCTIONS.md",
            |         "USB_WRITING_GUIDE.md",
            |         "FIX_SUMMARY.md",
            |         "PEP668_FIX_GUIDE.md",
            |     ]
            | }
            | 
            | # ==============================================================================
            | # TEXT SANITIZER
            | # ==============================================================================
            | def sanitize_text(text):
            |     replacements = {
            |         "├──": "|--", "└──": "`--", "│": "|  ", "──": "--",
            |         "✅": "[OK] ", "❌": "[X] ", "⚠️": "[!] ", "🚀": ">> ",
            |         "🐾": "", "🌱": "", "✨": "* ", "🔒": "[SEC] ",
            |         "🔐": "[KEY] ", "📦": "[PKG] ", "📁": "[DIR] ",
            |         "📄": "[FILE] ", "🔧": "[TOOL] ", "🐛": "[BUG] ",
            |         "💡": "[TIP] ", "🎨": "[ART] ", "💾": "[DISK] ",
            |         "📊": "[STATS] ", "📝": "[NOTE] ", "👉": "-> ",
            |         "🎉": "!",
            |     }
            |     for char, replacement in replacements.items():
            |         text = text.replace(char, replacement)
            |     return text
            | 
            | # ==============================================================================
            | # CSS STYLING
            | # ==============================================================================
            | CSS = """
            |     @page {
            |         size: letter;
            |         margin: 0.75in;
            |         margin-bottom: 1.2in;
            |         @frame footer_frame {
            |             -pdf-frame-content: footerContent;
            |             bottom: 0.5in;
            |             margin-left: 0.75in;
            |             margin-right: 0.75in;
            |             height: 0.5in;
            |         }
            |     }
            |     body {
            |         font-family: Helvetica, sans-serif;
            |         font-size: 10pt;
            |         line-height: 1.4;
            |         color: #222;
            |     }
            | 
            |     /* Footer */
            |     #footerContent {
            |         text-align: center;
            |         font-size: 8pt;
            |         color: #888;
            |         border-top: 1px solid #ccc;
            |         padding-top: 5px;
            |     }
            | 
            |     /* Headers */
            |     h1 { color: #E85D04; border-bottom: 2px solid #333; padding-bottom: 5px; margin-top: 0px; font-size: 18pt; }
            |     h2 { color: #333; margin-top: 20px; font-size: 14pt; border-bottom: 1px solid #ddd; }
            |     h3 { color: #555; font-size: 12pt; margin-top: 15px; font-weight: bold; }
            | 
            |     /* Document Title Headers */
            |     h1.doc-title {
            |         background-color: #333;
            |         color: #fff;
            |         padding: 5px 10px;
            |         font-family: Courier, monospace;
            |         font-size: 11pt;
            |         margin-bottom: 20px;
            |         border-radius: 3px;
            |         page-break-after: avoid;
            |     }
            | 
            |     h1.section-title {
            |         color: #E85D04;
            |         font-size: 24pt;
            |         text-align: center;
            |         margin-top: 200px;
            |         page-break-after: always;
            |     }
            | 
            |     /* Code Blocks */
            |     pre {
            |         font-family: 'Courier New', Courier, monospace;
            |         background-color: #f4f4f4;
            |         color: #000;
            |         padding: 8px;
            |         border: 1px solid #ccc;
            |         border-radius: 4px;
            |         font-size: 7pt;
            |         line-height: 1.2;
            |         white-space: pre;
            |         overflow: hidden;
            |         display: block;
            |         margin-bottom: 15px;
            |     }
            |     code { font-family: 'Courier New', Courier, monospace; background-color: #eee; padding: 2px 4px; font-size: 9pt; }
            | 
            |     /* Page Breaks */
            |     .file-break { page-break-before: always; }
            | 
            |     /* Cover Page */
            |     .cover-page { text-align: center; margin-top: 100px; page-break-after: always; }
            |     .cover-title { font-size: 36pt; font-weight: bold; color: #E85D04; margin-top: 20px; }
            | 
            |     /* Doc Control Table */
            |     .doc-control { margin-top: 50px; width: 100%; border-collapse: collapse; }
            |     .doc-control td { border: 1px solid #ddd; padding: 8px; font-size: 9pt; }
            |     .doc-control th { background-color: #eee; border: 1px solid #ddd; padding: 8px; font-size: 9pt; text-align: left; }
            | """
            | 
            | def get_file_content(filepath):
            |     try:
            |         with open(filepath, "r", encoding="utf-8") as f:
            |             text = f.read()
            |     except Exception:
            |         return ""
            | 
            |     text = sanitize_text(text)
            |     ext = os.path.splitext(filepath)[1].lower()
            |     filename = os.path.basename(filepath)
            | 
            |     html = f"<div class='file-break'><h1 class='doc-title'>{filename}</h1>"
            | 
            |     if ext == ".md":
            |         try:
            |             html += markdown.markdown(text, extensions=['extra', 'codehilite'])
            |         except Exception:
            |             html += f"<pre>{text}</pre>"
            |     else:
            |         html += f"<pre>{text}</pre>"
            | 
            |     html += "</div>"
            |     return html
            | 
            | def find_file_path(filename):
            |     for d in SOURCE_DIRS:
            |         possible_path = os.path.join(d, filename)
            |         if os.path.exists(possible_path) and os.path.isfile(possible_path):
            |             return possible_path
            |     return None
            | 
            | def collect_appendix_files(processed_files):
            |     appendix = []
            |     for d in SOURCE_DIRS:
            |         if not os.path.exists(d): continue
            |         if d == ".":
            |             candidates = [f for f in os.listdir(d) if os.path.isfile(f)]
            |         else:
            |             candidates = []
            |             for root, _, files in os.walk(d):
            |                 for f in files: candidates.append(os.path.join(root, f))
            | 
            |         for f_path in candidates:
            |             fname = os.path.basename(f_path)
            |             if fname in processed_files or fname in EXCLUDE_FILES or fname.startswith('.'):
            |                 continue
            |             if os.path.splitext(fname)[1].lower() in EXTENSIONS:
            |                 full_path = f_path if d == "." else f_path
            |                 appendix.append(full_path)
            |     appendix.sort()
            |     return appendix
            | 
            | def get_logo_html():
            |     """Finds logo.png or icon.png and returns HTML image tag."""
            |     possible_logos = ["images/logo.png", "images/icon.png"]
            |     logo_path = None
            | 
            |     for rel_path in possible_logos:
            |         full_path = os.path.abspath(rel_path)
            |         if os.path.exists(full_path):
            |             logo_path = full_path
            |             print(f"🖼️  Found branding image: {rel_path}")
            |             break
            | 
            |     if logo_path:
            |         # 250px width ensures it fits nicely on the page
            |         return f'<img src="{logo_path}" style="width: 250px; height: auto; margin-bottom: 20px;" />'
            |     return ""
            | 
            | def generate_pdf():
            |     print("🐾 FurryOS Docs Compiler (Branded) 🐾")
            | 
            |     content_html = ""
            |     processed_filenames = set()
            | 
            |     # --- Process Sections ---
            |     for section_name, files in BOOK_STRUCTURE.items():
            |         print(f"📘 Processing Section: {section_name}")
            |         for filename in files:
            |             full_path = find_file_path(filename)
            |             if full_path:
            |                 print(f"   + {filename}")
            |                 content_html += get_file_content(full_path)
            |                 processed_filenames.add(filename)
            |             else:
            |                 print(f"   ⚠️  Missing: {filename}")
            | 
            |     # --- Process Appendix ---
            |     appendix_files = collect_appendix_files(processed_filenames)
            |     if appendix_files:
            |         print("📎 Processing Appendix...")
            |         content_html += "<div class='file-break'><h1 class='doc-title'>Appendix</h1></div>"
            |         for full_path in appendix_files:
            |             filename = os.path.basename(full_path)
            |             print(f"   + {filename}")
            |             content_html += get_file_content(full_path)
            | 
            |     # --- Build Final HTML ---
            |     logo_html = get_logo_html()
            | 
            |     full_html = f"""
            |     <html>
            |     <head><style>{CSS}</style></head>
            |     <body>
            |         <div id="footerContent">FurryOS {VERSION} — <pdf:pagenumber></div>
            | 
            |         <!-- COVER PAGE -->
            |         <div class="cover-page">
            |             {logo_html}
            |             <div class="cover-title">FurryOS</div>
            |             <div style="font-size: 24pt; color: #555;">Complete Documentation</div>
            | 
            |             <div style="margin-top: 50px; color: #888;">Generated: {TIMESTAMP}</div>
            |             <div style="margin-top: 20px; font-size: 14pt; color: #333; font-weight: bold;">{BRANDING}</div>
            | 
            |             <!-- Document Control Table -->
            |             <br><br><br>
            |             <table class="doc-control" align="center" style="width: 80%;">
            |                 <tr><th>Version</th><td>{VERSION}</td></tr>
            |                 <tr><th>Status</th><td>Origin Release</td></tr>
            |                 <tr><th>Codename</th><td>Sovereign Universe</td></tr>
            |                 <tr><th>License</th><td>MIT License (Public)</td></tr>
            |             </table>
            |         </div>
            | 
            |         <!-- INDEX -->
            |         <div class="file-break">
            |             <h1 style="color: #333; border: none;">Table of Contents</h1>
            |             <pdf:toc />
            |         </div>
            | 
            |         <!-- CONTENT -->
            |         {content_html}
            |     </body></html>
            |     """
            | 
            |     print(f"✍️  Writing PDF to {OUTPUT_FILENAME}...")
            |     try:
            |         with open(OUTPUT_FILENAME, "wb") as output_file:
            |             pisa_status = pisa.CreatePDF(src=full_html, dest=output_file)
            |         if not pisa_status.err:
            |             print(f"✅ Success! PDF saved to: {os.path.abspath(OUTPUT_FILENAME)}")
            |         else:
            |             print("❌ Error generating PDF")
            |     except Exception as e:
            |         print(f"❌ Critical Error: {e}")
            | 
            | if __name__ == "__main__":
            |     generate_pdf()
            --- CONTENT END ---
        quick_start.sh
            --- CONTENT START ---
            | #!/bin/bash
            | # FurryOS Quick Start - Golden State Edition
            | set -e
            | 
            | echo "==============================================================================="
            | echo "   🐾 FURRYOS GOLDEN STATE BUILDER 🐾"
            | echo "==============================================================================="
            | echo "   1. CLEANUP: Wiping old artifacts"
            | echo "   2. SETUP: Creating environment"
            | echo "   3. DOCS: Generating PDF Guide"
            | echo "   4. COMPILE: Building C/ASM core"
            | echo "   5. DEPLOY: Building ISO with Library & Docs"
            | echo "==============================================================================="
            | 
            | # --- STEP 1: CLEAN ---
            | if [ -d "furryos_venv" ]; then rm -rf furryos_venv; fi
            | if [ -d "output" ]; then rm -rf output; fi
            | if [ -d "furryos_build" ]; then rm -rf furryos_build; fi
            | 
            | # --- STEP 2: SETUP ---
            | echo ""
            | echo "🔍 [2/6] CHECKING DEPENDENCIES..."
            | sudo apt-get update -qq
            | sudo apt-get install -y python3 python3-pip python3-venv build-essential nasm gcc g++ genisoimage xorriso mtools grub-pc-bin grub-efi-amd64-bin parted dosfstools rsync
            | 
            | echo ""
            | echo "🐍 [3/6] SETTING UP VENV..."
            | chmod +x setup_venv.sh
            | ./setup_venv.sh
            | source furryos_venv/bin/activate
            | 
            | # --- STEP 3: DOCS ---
            | echo ""
            | echo "📘 [3.5/6] COMPILING USER GUIDE..."
            | # Ensure PDF deps are installed in venv (xhtml2pdf)
            | pip install -q xhtml2pdf markdown
            | python3 compile_docs.py
            | 
            | # --- STEP 4: KEYS & COMPILE ---
            | echo ""
            | echo "🔐 [4/6] CHECKING KEYS..."
            | if [ ! -f "signing_keys/furryos_signing.key" ]; then
            |     python3 assets/generate_signing_keys.py
            | fi
            | 
            | echo ""
            | echo "🔨 [5/6] COMPILING MODULES..."
            | sudo furryos_venv/bin/python3 assets/launcher.py
            | 
            | # --- STEP 5: BUILD ---
            | echo ""
            | echo "💿 [6/6] BUILDING ISO..."
            | sudo furryos_venv/bin/python3 assets/deploy_iso.py
            | 
            | # --- DONE ---
            | echo ""
            | if ls output/furryos-*.iso 1> /dev/null 2>&1; then
            |     ISO_FILE=$(ls output/furryos-*.iso | head -n 1)
            |     echo "📀 ISO Created: $ISO_FILE"
            |     echo "   Next: sudo python3 assets/create_partitions.py"
            | else
            |     echo "❌ Error: ISO generation failed."
            |     exit 1
            | fi
            --- CONTENT END ---
        generate_compiler.py
            --- CONTENT START ---
            | import os
            | 
            | ROOT_DIR = os.getcwd()
            | 
            | def generate_compiler_suite():
            |     print("🛠️  Injecting Cross-Compiler Suite for Debian 13 -> Windows/Linux...")
            |     
            |     base_path = os.path.join(ROOT_DIR, 'build_system')
            |     if not os.path.exists(base_path):
            |         os.makedirs(base_path)
            | 
            |     # 1. SETUP SCRIPT
            |     setup_sh = """#!/bin/bash
            | # Installs the Cross-Compiler toolchain on Debian 13
            | echo "🔧 Installing MinGW-w64 (Windows Compiler) and NASM (Assembler)..."
            | sudo apt-get update
            | sudo apt-get install -y build-essential mingw-w64 nasm make
            | echo "✅ Toolchain installed."
            | """
            |     with open(os.path.join(base_path, 'install_toolchain.sh'), 'w') as f:
            |         f.write(setup_sh)
            |     os.chmod(os.path.join(base_path, 'install_toolchain.sh'), 0o755)
            | 
            |     # 2. MAKEFILE
            |     makefile = """
            | CC_LINUX = gcc
            | CC_WIN = x86_64-w64-mingw32-gcc
            | ASM = nasm
            | CFLAGS = -Wall -O2
            | WIN_GUI_FLAGS = -mwindows
            | 
            | all: linux_cli.elf windows_cli.exe windows_gui.exe
            | 
            | linux_cli.elf: core_logic.o wrapper_cli.c
            | \t$(CC_LINUX) $(CFLAGS) wrapper_cli.c core_logic.o -o bin/app_linux
            | 
            | windows_cli.exe: core_logic.obj wrapper_cli.c
            | \t$(CC_WIN) $(CFLAGS) wrapper_cli.c core_logic.obj -o bin/app_console.exe
            | 
            | windows_gui.exe: core_logic.obj wrapper_gui.c
            | \t$(CC_WIN) $(CFLAGS) $(WIN_GUI_FLAGS) wrapper_gui.c core_logic.obj -o bin/app_gui.exe
            | 
            | core_logic.o: core_logic.asm
            | \t$(ASM) -f elf64 core_logic.asm -o core_logic.o
            | 
            | core_logic.obj: core_logic.asm
            | \t$(ASM) -f win64 core_logic.asm -o core_logic.obj
            | 
            | clean:
            | \trm -f *.o *.obj bin/*
            | """
            |     with open(os.path.join(base_path, 'Makefile'), 'w') as f:
            |         f.write(makefile)
            | 
            |     # 3. ASM CORE
            |     asm_code = """
            | global get_magic_number
            | section .text
            | get_magic_number:
            |     mov rax, 42
            |     ret
            | """
            |     with open(os.path.join(base_path, 'core_logic.asm'), 'w') as f:
            |         f.write(asm_code)
            | 
            |     # 4. C CLI WRAPPER
            |     cli_code = """
            | #include <stdio.h>
            | extern int get_magic_number();
            | int main() {
            |     printf("FurryOS CLI Wrapper\\nMagic number from ASM: %d\\n", get_magic_number());
            |     return 0;
            | }
            | """
            |     with open(os.path.join(base_path, 'wrapper_cli.c'), 'w') as f:
            |         f.write(cli_code)
            | 
            |     # 5. C GUI WRAPPER
            |     gui_code = """
            | #include <windows.h>
            | #include <stdio.h>
            | extern int get_magic_number();
            | int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPSTR lpCmdLine, int nCmdShow) {
            |     char buffer[100];
            |     sprintf(buffer, "FurryOS Native GUI\\n\\nData from ASM Core: %d", get_magic_number());
            |     MessageBox(NULL, buffer, "FurryOS App", MB_OK | MB_ICONINFORMATION);
            |     return 0;
            | }
            | """
            |     with open(os.path.join(base_path, 'wrapper_gui.c'), 'w') as f:
            |         f.write(gui_code)
            | 
            |     # Create bin directory
            |     if not os.path.exists(os.path.join(base_path, 'bin')):
            |         os.makedirs(os.path.join(base_path, 'bin'))
            |     
            |     print("✅ Compiler Suite Generated in '/build_system'")
            | 
            | if __name__ == "__main__":
            |     generate_compiler_suite()
            --- CONTENT END ---
        notarize_anthroheart.py
            --- CONTENT START ---
            | #!/usr/bin/env python3
            | """
            | ===============================================================================
            |  The AnthroHeart Collection – Genesis Notarization (2025)
            | ===============================================================================
            | Date:   December 25, 2025
            | Author: AnthroHeart Project
            | Organization: Anthro Entertainment LLC
            | Type:   Genesis Record (Sequence 0)
            | ===============================================================================
            | """
            | 
            | import os
            | import sys
            | import subprocess
            | import platform
            | import shutil
            | from datetime import datetime, timezone
            | 
            | # ---------------------------------------------------------------------------
            | # 1. VENV BOOTSTRAP (Cross-Platform & Self-Healing)
            | # ---------------------------------------------------------------------------
            | 
            | VENV_DIR = ".venv"
            | REQUIRED_PACKAGES = ["pynacl", "opentimestamps-client"]
            | 
            | def in_venv():
            |     return sys.prefix != sys.base_prefix
            | 
            | def bootstrap_venv():
            |     # Detect Python executable for the venv
            |     if platform.system() == "Windows":
            |         venv_python = os.path.join(VENV_DIR, "Scripts", "python.exe")
            |     else:
            |         venv_python = os.path.join(VENV_DIR, "bin", "python")
            | 
            |     if not os.path.exists(VENV_DIR):
            |         print(f"🔧 Creating virtual environment in {VENV_DIR}...")
            |         subprocess.check_call([sys.executable, "-m", "venv", VENV_DIR])
            | 
            |     # If we are not in the venv, install deps and relaunch
            |     if not in_venv():
            |         print("📦 Verifying dependencies...")
            |         subprocess.check_call([venv_python, "-m", "pip", "install", "--upgrade", "pip", "-q"])
            |         subprocess.check_call([venv_python, "-m", "pip", "install", *REQUIRED_PACKAGES, "-q"])
            |         
            |         print("🔁 Re-launching script inside venv...\n")
            |         subprocess.check_call([venv_python] + sys.argv)
            |         sys.exit(0)
            | 
            | if not in_venv():
            |     bootstrap_venv()
            | 
            | # ---------------------------------------------------------------------------
            | # 2. CORE LOGIC
            | # ---------------------------------------------------------------------------
            | 
            | import json
            | import hashlib
            | from nacl.signing import SigningKey
            | from nacl.encoding import HexEncoder
            | 
            | # CONFIGURATION
            | FILE_PATH = "The_AnthroHeart_Collection_Bundle.7z"
            | OUT_DIR = "anthroheart_chain"
            | KEY_FILE = "anthroheart_genesis.key"
            | 
            | METADATA = {
            |     "author": "AnthroHeart Project",
            |     "organization": "Anthro Entertainment LLC",
            |     "license": "CC0-1.0",
            |     "website": "https://anthroentertainment.com",
            |     "publication_date": "2025-12-25",
            |     "notes": "Genesis record. Hash-only notarization anchored via OpenTimestamps."
            | }
            | 
            | def get_hashes(filepath):
            |     print(f"⚙️  Hashing {filepath} (SHA256 & SHA512)...")
            |     sha256 = hashlib.sha256()
            |     sha512 = hashlib.sha512()
            |     
            |     total_size = os.path.getsize(filepath)
            |     processed = 0
            |     
            |     with open(filepath, "rb") as f:
            |         while chunk := f.read(4096 * 4096): # 16MB chunks
            |             sha256.update(chunk)
            |             sha512.update(chunk)
            |             processed += len(chunk)
            |             if total_size > 0:
            |                 print(f"   Progress: {int((processed/total_size)*100)}%", end="\r")
            |             
            |     print("\n   Hashing complete.")
            |     return sha256.hexdigest(), sha512.hexdigest()
            | 
            | def load_or_generate_key(key_path):
            |     if os.path.exists(key_path):
            |         print(f"🔑 Loading existing identity from {key_path}...")
            |         with open(key_path, "r") as f:
            |             private_hex = f.read().strip()
            |         return SigningKey(private_hex, encoder=HexEncoder)
            |     else:
            |         print(f"✨ Generating NEW Genesis Identity...")
            |         signing_key = SigningKey.generate()
            |         private_hex = signing_key.encode(encoder=HexEncoder).decode()
            |         with open(key_path, "w") as f:
            |             f.write(private_hex)
            |         print(f"⚠️  WARNING: {key_path} created. BACK THIS UP. You need it to sign future updates.")
            |         return signing_key
            | 
            | def main():
            |     if not os.path.exists(FILE_PATH):
            |         print(f"❌ Error: Archive not found: {FILE_PATH}")
            |         sys.exit(1)
            | 
            |     os.makedirs(OUT_DIR, exist_ok=True)
            | 
            |     # 1. Manage Identity
            |     signing_key = load_or_generate_key(os.path.join(OUT_DIR, KEY_FILE))
            |     verify_key = signing_key.verify_key
            |     public_key_hex = verify_key.encode(encoder=HexEncoder).decode()
            | 
            |     # Save public key for verification
            |     with open(os.path.join(OUT_DIR, "anthro_public.key"), "w") as f:
            |         f.write(public_key_hex)
            | 
            |     # 2. Hash Content
            |     h256, h512 = get_hashes(FILE_PATH)
            | 
            |     # 3. Sign Hashes
            |     signature = signing_key.sign(h512.encode()).signature.hex()
            | 
            |     # 4. Construct Genesis Metadata
            |     record = {
            |         "manifest_version": "1.0",
            |         "sequence": 0,
            |         "previous_record": None,
            |         # Fixed the deprecated datetime warning
            |         "timestamp_claim": datetime.now(timezone.utc).isoformat(),
            |         
            |         "file_info": {
            |             "filename": FILE_PATH,
            |             "size_bytes": os.path.getsize(FILE_PATH),
            |             "hashes": {
            |                 "sha256": h256,
            |                 "sha512": h512
            |             }
            |         },
            |         "identity": {
            |             "public_key": public_key_hex,
            |             "signature_of_sha512": signature,
            |             "algorithm": "Ed25519"
            |         },
            |         "metadata": METADATA
            |     }
            | 
            |     # 5. Save Record
            |     record_filename = f"genesis_record_{METADATA['publication_date']}.json"
            |     record_path = os.path.join(OUT_DIR, record_filename)
            | 
            |     with open(record_path, "w") as f:
            |         json.dump(record, f, indent=2, sort_keys=True)
            | 
            |     # 6. Timestamp (FIXED)
            |     print(f"⏳ Submitting to Bitcoin blockchain via OpenTimestamps...")
            |     
            |     # Calculate path to the 'ots' executable inside the venv to avoid PATH errors
            |     if platform.system() == "Windows":
            |         ots_executable = os.path.join(sys.prefix, "Scripts", "ots.exe")
            |     else:
            |         ots_executable = os.path.join(sys.prefix, "bin", "ots")
            | 
            |     try:
            |         if not os.path.exists(ots_executable):
            |             # Fallback for some linux distros if they put it elsewhere in venv
            |             ots_executable = "ots" 
            |             
            |         subprocess.check_call([ots_executable, "stamp", record_path])
            |         print("✅ Timestamp proof created successfully.")
            |         
            |         # 7. Final Report
            |         print("\n" + "="*60)
            |         print(" ANTHROHEART COLLECTION - GENESIS COMPLETE")
            |         print("="*60)
            |         print(f"📂 Output Directory: {OUT_DIR}/")
            |         print(f"📄 Genesis Record:   {record_filename}")
            |         print(f"🛡️  Proof File:       {record_filename}.ots")
            |         print(f"🔑 Identity Key:     {KEY_FILE} (DO NOT SHARE/LOSE THIS)")
            |         print("-" * 60)
            |         print("Next Steps:")
            |         print("1. Keep the .ots file safe. It is your proof.")
            |         print("2. When you upgrade this collection, increment 'sequence' to 1")
            |         print("   and hash this genesis JSON as 'previous_record'.")
            |         print("="*60)
            |         
            |     except subprocess.CalledProcessError:
            |         print("❌ Error: OpenTimestamps server returned an error.")
            |         print("   Wait a moment and try running manually: 'ots stamp <file>'")
            |     except FileNotFoundError:
            |         print(f"❌ Error: Could not find 'ots' executable at {ots_executable}")
            |         print("   Please ensure the requirements installed correctly.")
            | 
            | if __name__ == "__main__":
            |     main()
            --- CONTENT END ---
        create_partitions.py
            --- CONTENT START ---
            | #!/usr/bin/env python3
            | """
            | ===============================================================================
            | FURRYOS SMART USB CREATOR (ROBUST)
            | ===============================================================================
            | """
            | import os
            | import sys
            | import subprocess
            | import time
            | from pathlib import Path
            | 
            | def run_cmd(cmd, ignore=False):
            |     try:
            |         subprocess.run(cmd, shell=True, check=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
            |     except:
            |         if not ignore: print(f"Error: {cmd}")
            | 
            | def aggressive_unmount(dev):
            |     print(f"   🔓 Unmounting {dev}...")
            |     run_cmd(f"umount -f {dev}*", ignore=True)
            |     run_cmd(f"swapoff {dev}*", ignore=True)
            |     time.sleep(1)
            | 
            | def main():
            |     if os.geteuid() != 0: sys.exit("❌ Run as root")
            |     print("📀 Available Devices:")
            |     subprocess.run("lsblk -d -o NAME,SIZE,MODEL", shell=True)
            |     
            |     dev = input("\nTarget Device (e.g. sdb): ").strip()
            |     if not dev.startswith("/dev/"): dev = f"/dev/{dev}"
            |     
            |     if input(f"⚠️  ERASE {dev}? [y/N]: ") != "y": sys.exit()
            |     
            |     print("🔧 Partitioning...")
            |     aggressive_unmount(dev)
            |     run_cmd(f"wipefs -a {dev}")
            |     run_cmd(f"parted -s {dev} mklabel gpt")
            |     run_cmd(f"parted -s {dev} mkpart primary 1MiB 2MiB")
            |     run_cmd(f"parted -s {dev} set 1 bios_grub on")
            |     run_cmd(f"parted -s {dev} mkpart primary fat32 2MiB 514MiB")
            |     run_cmd(f"parted -s {dev} set 2 esp on")
            |     run_cmd(f"parted -s {dev} mkpart primary ext4 514MiB 100%")
            |     
            |     run_cmd(f"partprobe {dev}")
            |     time.sleep(2)
            |     
            |     print("💾 Formatting...")
            |     aggressive_unmount(dev)
            |     p = "p" if dev[-1].isdigit() else ""
            |     run_cmd(f"mkfs.vfat -F32 -n FURRY_EFI {dev}{p}2")
            |     run_cmd(f"mkfs.ext4 -F -L FURRY_ROOT {dev}{p}3")
            |     
            |     print("🔥 Installing Bootloader...")
            |     mnt = "/mnt/furry_install"
            |     run_cmd(f"mkdir -p {mnt}")
            |     run_cmd(f"mount {dev}{p}3 {mnt}")
            |     run_cmd(f"mkdir -p {mnt}/boot/efi")
            |     run_cmd(f"mount {dev}{p}2 {mnt}/boot/efi")
            |     
            |     iso = list(Path("output").glob("furryos-*.iso"))
            |     if iso:
            |         print(f"   Extracting {iso[0].name}...")
            |         subprocess.run(f"xorriso -osirrox on -indev {iso[0]} -extract / {mnt}", shell=True)
            |     
            |     try:
            |         subprocess.run(f"grub-install --target=x86_64-efi --efi-directory={mnt}/boot/efi --boot-directory={mnt}/boot --removable --recheck", shell=True)
            |         subprocess.run(f"grub-install --target=i386-pc --boot-directory={mnt}/boot --recheck {dev}", shell=True)
            |     except: pass
            |     
            |     run_cmd(f"umount -R {mnt}")
            |     print("✅ DONE! USB Ready.")
            | 
            | if __name__ == "__main__":
            |     main()
            --- CONTENT END ---
        generate_signing_keys.py
            --- CONTENT START ---
            | #!/usr/bin/env python3
            | """
            | ===============================================================================
            | FURRYOS SIGNING KEY GENERATOR
            | ===============================================================================
            | Location: /TOP/assets/generate_signing_keys.py
            | Generates Ed25519 keypair for self-signing all FurryOS binaries
            | Keys are saved in /TOP/signing_keys/
            | ===============================================================================
            | """
            | 
            | import os
            | import sys
            | from pathlib import Path
            | from datetime import datetime
            | 
            | def find_api_key():
            |     """
            |     Intelligently hunts for the API key by walking up the directory tree.
            |     Works regardless of where this script is run from.
            |     """
            |     import os, sys
            |     filename = 'Gemini_API.key.txt'
            |     current_search_dir = os.path.dirname(os.path.abspath(__file__))
            |     
            |     # Walk up the tree (max 5 levels) to find the 'assets' folder
            |     for _ in range(5):
            |         potential_key = os.path.join(current_search_dir, 'assets', filename)
            |         if os.path.exists(potential_key):
            |             return potential_key
            |         
            |         # Move up one level
            |         parent_dir = os.path.dirname(current_search_dir)
            |         if parent_dir == current_search_dir: # We hit the root
            |             break
            |         current_search_dir = parent_dir
            |     
            |     # Fallback: Check Desktop
            |     desktop_fallback = os.path.expanduser('~/Desktop/Gemini_API.key.txt')
            |     if os.path.exists(desktop_fallback):
            |         return desktop_fallback
            | 
            |     print("❌ CRITICAL ERROR: Could not find 'Gemini_API.key.txt' anywhere.")
            |     sys.exit(1)
            | 
            | 
            | def banner():
            |     print("\n" + "="*80)
            |     print("   🔐 FURRYOS SIGNING KEY GENERATOR 🔐")
            |     print("="*80 + "\n")
            | 
            | def check_cryptography():
            |     """Check if cryptography is available"""
            |     try:
            |         import cryptography
            |         return True
            |     except ImportError:
            |         print("❌ cryptography package not found")
            |         print("")
            |         print("Please install it:")
            |         print("   Option 1 (venv): source activate_furryos.sh")
            |         print("   Option 2 (system): sudo pip3 install cryptography")
            |         print("   Option 3 (venv): ./setup_venv.sh")
            |         print("")
            |         return False
            | 
            | def generate_keypair():
            |     """Generate Ed25519 signing keypair"""
            |     from cryptography.hazmat.primitives.asymmetric import ed25519
            | 
            |     print("🔑 Generating Ed25519 keypair...")
            |     private_key = ed25519.Ed25519PrivateKey.generate()
            |     public_key = private_key.public_key()
            | 
            |     return private_key, public_key
            | 
            | def save_keys(private_key, public_key, key_dir):
            |     """Save keys to disk"""
            |     from cryptography.hazmat.primitives import serialization
            | 
            |     Path(key_dir).mkdir(parents=True, exist_ok=True)
            | 
            |     # Save private key
            |     private_pem = private_key.private_bytes(
            |         encoding=serialization.Encoding.PEM,
            |         format=serialization.PrivateFormat.PKCS8,
            |         encryption_algorithm=serialization.NoEncryption()
            |     )
            | 
            |     private_path = f"{key_dir}/furryos_signing.key"
            |     with open(private_path, 'wb') as f:
            |         f.write(private_pem)
            |     os.chmod(private_path, 0o600)
            |     print(f"✓ Private key: {private_path}")
            | 
            |     # Save public key
            |     public_pem = public_key.public_bytes(
            |         encoding=serialization.Encoding.PEM,
            |         format=serialization.PublicFormat.SubjectPublicKeyInfo
            |     )
            | 
            |     public_path = f"{key_dir}/furryos_signing.pub"
            |     with open(public_path, 'wb') as f:
            |         f.write(public_pem)
            |     os.chmod(public_path, 0o644)
            |     print(f"✓ Public key: {public_path}")
            | 
            |     # Save metadata
            |     metadata = f"""# FurryOS Signing Keys
            | Generated: {datetime.now().strftime('%Y-%m-%d %H:%M:%S UTC')}
            | Algorithm: Ed25519
            | Purpose: Self-signing FurryOS binaries
            | 
            | ## Private Key
            | File: furryos_signing.key
            | Permissions: 0600 (owner read/write only)
            | DO NOT SHARE THIS FILE!
            | 
            | ## Public Key
            | File: furryos_signing.pub
            | Permissions: 0644 (world readable)
            | Distribute with binaries for verification
            | 
            | ## Usage
            | 
            | ### Sign a binary:
            | python3 assets/sign_binary.py furryos_build/bin/heartbeat_core
            | 
            | ### Verify a signature:
            | python3 assets/verify_signature.py furryos_build/bin/heartbeat_core
            | """
            | 
            |     with open(f"{key_dir}/README.txt", 'w') as f:
            |         f.write(metadata)
            | 
            |     print(f"✓ Metadata: {key_dir}/README.txt")
            | 
            | def main():
            |     banner()
            | 
            |     # Change to /TOP directory (parent of assets/)
            |     script_dir = os.path.dirname(os.path.abspath(__file__))
            |     if script_dir.endswith('/assets'):
            |         os.chdir(os.path.dirname(script_dir))
            | 
            |     # Check cryptography
            |     if not check_cryptography():
            |         sys.exit(1)
            | 
            |     KEY_DIR = "signing_keys"
            | 
            |     if os.path.exists(f"{KEY_DIR}/furryos_signing.key"):
            |         print(f"⚠️  Keys already exist in {KEY_DIR}/")
            |         response = input("Regenerate? [y/N]: ")
            |         if response.lower() != 'y':
            |             print("✓ Keeping existing keys")
            |             return
            | 
            |     private_key, public_key = generate_keypair()
            |     print(f"\n💾 Saving keys to {KEY_DIR}/")
            |     save_keys(private_key, public_key, KEY_DIR)
            | 
            |     print("\n" + "="*80)
            |     print("   🎉 SIGNING KEYS GENERATED! 🎉")
            |     print("="*80)
            |     print(f"\n📁 Location: {KEY_DIR}/")
            |     print("\n🔐 furryos_signing.key (PRIVATE - KEEP SECRET!)")
            |     print("🔓 furryos_signing.pub (PUBLIC - DISTRIBUTE)")
            |     print("📝 README.txt (usage instructions)")
            |     print("\n🔒 Security:")
            |     print("   • Private key permissions: 0600 (owner only)")
            |     print("   • Public key permissions: 0644 (world readable)")
            |     print("   • Algorithm: Ed25519 (256-bit security)")
            |     print("\n🐾 Keys ready for signing binaries! 🌱\n")
            | 
            | if __name__ == "__main__":
            |     main()
            --- CONTENT END ---
        generate_manifest.py
            --- CONTENT START ---
            | import os
            | 
            | def find_api_key():
            |     """
            |     Intelligently hunts for the API key by walking up the directory tree.
            |     Works regardless of where this script is run from.
            |     """
            |     import os, sys
            |     filename = 'Gemini_API.key.txt'
            |     current_search_dir = os.path.dirname(os.path.abspath(__file__))
            |     
            |     # Walk up the tree (max 5 levels) to find the 'assets' folder
            |     for _ in range(5):
            |         potential_key = os.path.join(current_search_dir, 'assets', filename)
            |         if os.path.exists(potential_key):
            |             return potential_key
            |         
            |         # Move up one level
            |         parent_dir = os.path.dirname(current_search_dir)
            |         if parent_dir == current_search_dir: # We hit the root
            |             break
            |         current_search_dir = parent_dir
            |     
            |     # Fallback: Check Desktop
            |     desktop_fallback = os.path.expanduser('~/Desktop/Gemini_API.key.txt')
            |     if os.path.exists(desktop_fallback):
            |         return desktop_fallback
            | 
            |     print("❌ CRITICAL ERROR: Could not find 'Gemini_API.key.txt' anywhere.")
            |     sys.exit(1)
            | 
            | # Configuration: Folders to scan relative to where the script is run
            | DIRS_TO_SCAN = [
            |     ".",              # Top level
            |     "assets",         # /TOP/assets
            |     "guides",         # /TOP/guides
            | ]
            | 
            | # Configuration: File extensions and specific filenames to include
            | # We exclude binary files like .png to avoid encoding errors
            | VALID_EXTENSIONS = (
            |     ".py", ".sh", ".c", ".s", ".yaml", ".txt", ".md", ".json"
            | )
            | EXACT_FILES = [
            |     "Makefile_optimized",
            |     "requirements.txt",
            |     "Dockerfile" # Just in case
            | ]
            | 
            | OUTPUT_FILE = "MANIFEST.txt"
            | 
            | def should_include(filename):
            |     """Checks if a file matches our extensions or exact name lists."""
            |     if filename in EXACT_FILES:
            |         return True
            |     return filename.endswith(VALID_EXTENSIONS) and not filename.startswith("MANIFEST")
            | 
            | def generate_manifest():
            |     print(f"Generating {OUTPUT_FILE}...")
            |     
            |     with open(OUTPUT_FILE, "w", encoding="utf-8") as manifest:
            |         # Write a header
            |         manifest.write("FurryOS Source Manifest\n")
            |         manifest.write("=======================\n\n")
            | 
            |         total_files = 0
            |         
            |         for directory in DIRS_TO_SCAN:
            |             if not os.path.exists(directory):
            |                 print(f"Warning: Directory '{directory}' not found. Skipping.")
            |                 continue
            | 
            |             # Walk through the directory
            |             for root, _, files in os.walk(directory):
            |                 # Ensure we strictly follow the structure (don't go too deep if not needed)
            |                 # This check ensures we don't scan sub-sub-folders if you only want the immediate children
            |                 # Remove the following 2 lines if you want recursive scanning everywhere
            |                 if directory == "." and root != ".": continue 
            |                 if directory != "." and root != directory: continue
            | 
            |                 for file in sorted(files):
            |                     if should_include(file):
            |                         filepath = os.path.join(root, file)
            |                         
            |                         # Normalize path for readability (remove ./ prefix)
            |                         clean_path = os.path.normpath(filepath)
            |                         
            |                         try:
            |                             with open(filepath, "r", encoding="utf-8", errors="replace") as f:
            |                                 content = f.read()
            |                                 
            |                             manifest.write(f"FILE_START: {clean_path}\n")
            |                             manifest.write("-" * 40 + "\n")
            |                             manifest.write(content)
            |                             if not content.endswith("\n"):
            |                                 manifest.write("\n")
            |                             manifest.write("-" * 40 + "\n")
            |                             manifest.write(f"FILE_END: {clean_path}\n\n")
            |                             
            |                             print(f"Added: {clean_path}")
            |                             total_files += 1
            |                         except Exception as e:
            |                             print(f"Error reading {clean_path}: {e}")
            | 
            |     print(f"\nSuccess! Scanned {total_files} files into {OUTPUT_FILE}.")
            | 
            | if __name__ == "__main__":
            |     generate_manifest()
            --- CONTENT END ---
        organize_project.py
            --- CONTENT START ---
            | import os
            | import shutil
            | import datetime
            | from pathlib import Path
            | import sys
            | 
            | def find_api_key():
            |     """
            |     Intelligently hunts for the API key by walking up the directory tree.
            |     Works regardless of where this script is run from.
            |     """
            |     import os, sys
            |     filename = 'Gemini_API.key.txt'
            |     current_search_dir = os.path.dirname(os.path.abspath(__file__))
            |     
            |     # Walk up the tree (max 5 levels) to find the 'assets' folder
            |     for _ in range(5):
            |         potential_key = os.path.join(current_search_dir, 'assets', filename)
            |         if os.path.exists(potential_key):
            |             return potential_key
            |         
            |         # Move up one level
            |         parent_dir = os.path.dirname(current_search_dir)
            |         if parent_dir == current_search_dir: # We hit the root
            |             break
            |         current_search_dir = parent_dir
            |     
            |     # Fallback: Check Desktop
            |     desktop_fallback = os.path.expanduser('~/Desktop/Gemini_API.key.txt')
            |     if os.path.exists(desktop_fallback):
            |         return desktop_fallback
            | 
            |     print("❌ CRITICAL ERROR: Could not find 'Gemini_API.key.txt' anywhere.")
            |     sys.exit(1)
            | 
            | 
            | # --- Configuration Constants ---
            | # Use pathlib for robust and platform-agnostic path manipulation
            | PROJECT_ROOT: Path = Path.cwd()
            | 
            | # Define paths for key files and directories relative to PROJECT_ROOT
            | API_KEY_FILE: Path = Path(find_api_key())
            | 
            | # Archive directory will be timestamped to prevent overwriting across runs
            | ARCHIVE_DIR_NAME: str = 'ARCHIVE_' + datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
            | ARCHIVE_DIR: Path = PROJECT_ROOT / ARCHIVE_DIR_NAME
            | 
            | # Define the desired new project structure directories (names only)
            | # These will be created directly under PROJECT_ROOT if they don't exist.
            | NEW_STRUCTURE_DIRS: set[str] = {
            |     'scripts',
            |     'assets',
            |     'config',
            |     'docs',
            |     'build',
            |     'tests',  # Common directory for test files
            |     'src',    # Common directory for core source code
            |     'logs'    # Common directory for log files
            | }
            | 
            | # Mapping of file extensions to their target directories within the new structure
            | # The keys are file extensions (including the dot), values are directory names.
            | FILE_EXTENSION_MAP: dict[str, str] = {
            |     '.sh': 'scripts',
            |     '.py': 'scripts',
            |     '.rb': 'scripts',
            |     '.js': 'scripts',
            |     '.ps1': 'scripts',
            |     '.bat': 'scripts',
            |     '.exe': 'build',  # Executables often belong to build artifacts
            |     '.dll': 'build',
            |     '.so': 'build',
            |     '.bin': 'build',
            |     '.apk': 'build',
            |     '.ipa': 'build',
            |     '.deb': 'build',
            |     '.rpm': 'build',
            |     '.dmg': 'build',
            |     '.iso': 'build',
            |     '.zip': 'build',
            |     '.tar.gz': 'build', # Specific handling for double extensions
            |     '.tgz': 'build',
            |     '.tar.xz': 'build',
            |     '.txz': 'build',
            |     '.tar.bz2': 'build',
            |     '.tbz2': 'build',
            |     '.rar': 'build',
            |     '.7z': 'build',
            | 
            |     '.png': 'assets',
            |     '.jpg': 'assets',
            |     '.jpeg': 'assets',
            |     '.gif': 'assets',
            |     '.svg': 'assets',
            |     '.ico': 'assets',
            |     '.webp': 'assets',
            |     '.mp3': 'assets',
            |     '.wav': 'assets',
            |     '.mp4': 'assets',
            |     '.avi': 'assets',
            |     '.mov': 'assets',
            | 
            |     '.yaml': 'config',
            |     '.yml': 'config',
            |     '.conf': 'config',
            |     '.json': 'config',
            |     '.ini': 'config',
            |     '.toml': 'config',
            |     '.env': 'config',
            | 
            |     '.md': 'docs',
            |     '.txt': 'docs',
            |     '.rst': 'docs',
            |     '.rtf': 'docs',
            |     '.pdf': 'docs',
            |     '.log': 'logs',
            | 
            |     '.test': 'tests',
            |     '.spec': 'tests',
            |     '.e2e': 'tests',
            | }
            | 
            | # Directories that os.walk should entirely skip scanning (absolute paths)
            | # This prevents scanning into version control, virtual environments, caches,
            | # and the archive directory itself, as well as the target structure directories
            | # where files are meant to end up.
            | IGNORED_DIRS_TO_SKIP: set[Path] = {
            |     PROJECT_ROOT / '.git',
            |     PROJECT_ROOT / '__pycache__',
            |     PROJECT_ROOT / '.venv',
            |     PROJECT_ROOT / 'venv',
            |     ARCHIVE_DIR,
            | }
            | # Add all target NEW_STRUCTURE_DIRS to the ignored list to prevent moving
            | # files that are already correctly placed within these directories.
            | for d_name in NEW_STRUCTURE_DIRS:
            |     IGNORED_DIRS_TO_SKIP.add(PROJECT_ROOT / d_name)
            | 
            | # --- Helper Functions ---
            | 
            | def verify_api_key(api_key_path: Path) -> bool:
            |     """
            |     Verifies if the API key file exists and contains a non-empty key.
            | 
            |     Args:
            |         api_key_path: The absolute path to the API key file.
            | 
            |     Returns:
            |         True if the API key is found and not empty, False otherwise.
            |     """
            |     try:
            |         if not api_key_path.is_file():
            |             print(f"⚠️  API Key file not found at {api_key_path}.")
            |             return False
            | 
            |         key = api_key_path.read_text(encoding='utf-8').strip()
            |         if not key:
            |             print(f"⚠️  API Key file at {api_key_path} is empty.")
            |             return False
            | 
            |         print(f"✅ API Key found in {api_key_path.name}.")
            |         return True
            |     except IOError as e:
            |         print(f"❌ Error reading API Key file {api_key_path}: {e}")
            |         return False
            |     except Exception as e:
            |         print(f"❌ An unexpected error occurred while verifying API key: {e}")
            |         return False
            | 
            | def suggest_moves(root_dir: Path, ignored_paths: set[Path]) -> list[tuple[Path, Path]]:
            |     """
            |     Scans the project directory and suggests files to move based on their extension.
            | 
            |     Args:
            |         root_dir: The root directory of the project to scan.
            |         ignored_paths: A set of absolute paths to directories that should be skipped
            |                        during the scan.
            | 
            |     Returns:
            |         A list of tuples, where each tuple contains (old_file_path, target_directory_path).
            |     """
            |     print(f"\nScanning '{root_dir.name}' for organization suggestions...")
            |     suggested_moves: list[tuple[Path, Path]] = []
            |     # Get the name of this script file to avoid suggesting moving itself
            |     current_script_name = Path(__file__).name
            | 
            |     for current_walk_root, dirs, files in os.walk(root_dir):
            |         current_path_obj = Path(current_walk_root)
            | 
            |         # Optimize os.walk: modify 'dirs' in place to prevent descending into ignored directories
            |         # Create a new list for dirs to allow modification
            |         dirs_to_process = []
            |         for d in dirs:
            |             dir_path = current_path_obj / d
            |             # If the directory or any of its parent is in the ignored list, skip it
            |             if any(dir_path.is_relative_to(ignored_p) for ignored_p in ignored_paths):
            |                 print(f"   Skipping directory: {dir_path.relative_to(PROJECT_ROOT)}")
            |                 continue
            |             dirs_to_process.append(d)
            |         dirs[:] = dirs_to_process # Update dirs in-place for os.walk
            | 
            |         # If the current_path_obj itself is an ignored path, skip processing its files
            |         if any(current_path_obj.is_relative_to(ignored_p) for ignored_p in ignored_paths):
            |             continue
            | 
            |         for file_name in files:
            |             # Skip this script file
            |             if file_name == current_script_name:
            |                 continue
            | 
            |             file_path = current_path_obj / file_name
            |             suffix = file_path.suffix.lower()  # e.g., '.py', '.txt'
            | 
            |             # Handle common double extensions like .tar.gz
            |             full_ext = suffix
            |             if suffix in ['.gz', '.bz2', '.xz'] and file_path.stem.lower().endswith('.tar'):
            |                 full_ext = '.tar' + suffix
            |             elif suffix == '.tgz': # Often an alias for .tar.gz
            |                 full_ext = '.tar.gz'
            |             elif suffix == '.tbz2': # Often an alias for .tar.bz2
            |                 full_ext = '.tar.bz2'
            |             elif suffix == '.txz': # Often an alias for .tar.xz
            |                 full_ext = '.tar.xz'
            | 
            | 
            |             target_dir_name = FILE_EXTENSION_MAP.get(full_ext)
            | 
            |             if target_dir_name:
            |                 target_dir_path = root_dir / target_dir_name
            | 
            |                 # If the file's current parent directory is NOT its target directory,
            |                 # then suggest a move. This covers files in the root or in incorrect subfolders.
            |                 if file_path.parent != target_dir_path:
            |                     suggested_moves.append((file_path, target_dir_path))
            |     return suggested_moves
            | 
            | def execute_organization(moves: list[tuple[Path, Path]], archive_path: Path, new_structure_folder_names: set[str]) -> None:
            |     """
            |     Executes the suggested file moves, creating new directories and archiving old versions.
            | 
            |     Args:
            |         moves: A list of (source_path, target_directory_path) tuples.
            |         archive_path: The absolute path to the directory where old versions will be archived.
            |         new_structure_folder_names: A set of directory names (strings) to create
            |                                     at the project root.
            |     """
            |     if not moves:
            |         print("Everything looks organized already, no files to move!")
            |         return
            | 
            |     print(f"\nPreparing to organize {len(moves)} files.")
            | 
            |     # 1. Create the archive directory if it doesn't exist
            |     try:
            |         archive_path.mkdir(parents=True, exist_ok=True)
            |         print(f"📁 Ensured archive directory exists: {archive_path.relative_to(PROJECT_ROOT)}")
            |     except OSError as e:
            |         print(f"❌ Error creating archive directory {archive_path}: {e}")
            |         sys.exit(1)  # Critical error, cannot proceed without archive
            | 
            |     # 2. Create new structure folders at the project root
            |     for folder_name in new_structure_folder_names:
            |         target_folder_path = PROJECT_ROOT / folder_name
            |         try:
            |             target_folder_path.mkdir(parents=True, exist_ok=True)
            |             print(f"📁 Ensured target directory exists: {target_folder_path.relative_to(PROJECT_ROOT)}")
            |         except OSError as e:
            |             print(f"❌ Error creating new structure directory {target_folder_path}: {e}")
            |             # Non-critical, but report and continue
            |             pass
            | 
            |     print(f"\nPlanning to move {len(moves)} files.")
            |     print("WARNING: Files with the same name in the target directory will be backed up.")
            |     confirm = input("Type 'yes' to proceed with moving files: ").lower()
            | 
            |     if confirm == 'yes':
            |         success_count = 0
            |         failure_count = 0
            |         for old_path, target_dir_path in moves:
            |             file_name = old_path.name
            |             new_path = target_dir_path / file_name
            | 
            |             try:
            |                 # If a file with the same name already exists at the new target,
            |                 # move it to the archive with a unique timestamped backup name.
            |                 if new_path.is_file():
            |                     # Generate a timestamp with milliseconds for unique backups
            |                     timestamp = datetime.datetime.now().strftime("%Y%m%d%H%M%S%f")[:-3]
            |                     backup_name = f"{file_name}.{timestamp}.bak"
            |                     backup_path = archive_path / backup_name
            |                     shutil.move(new_path, backup_path)
            |                     print(f"   Backed up existing '{new_path.name}' to '{backup_path.relative_to(PROJECT_ROOT)}'")
            | 
            |                 # Now move the original file from its old location to the new organized location
            |                 shutil.move(old_path, new_path)
            |                 print(f"✅ Moved: '{old_path.relative_to(PROJECT_ROOT)}' -> '{new_path.relative_to(PROJECT_ROOT)}'")
            |                 success_count += 1
            |             except FileNotFoundError:
            |                 print(f"❌ Failed to move '{old_path.relative_to(PROJECT_ROOT)}': Source file not found.")
            |                 failure_count += 1
            |             except PermissionError:
            |                 print(f"❌ Failed to move '{old_path.relative_to(PROJECT_ROOT)}': Permission denied.")
            |                 failure_count += 1
            |             except shutil.Error as e:
            |                 print(f"❌ Failed to move '{old_path.relative_to(PROJECT_ROOT)}': {e}")
            |                 failure_count += 1
            |             except Exception as e:
            |                 print(f"❌ An unexpected error occurred while moving '{old_path.relative_to(PROJECT_ROOT)}': {e}")
            |                 failure_count += 1
            | 
            |         print(f"\n--- Organization Summary ---")
            |         print(f"✅ Successfully moved {success_count} files.")
            |         if failure_count > 0:
            |             print(f"❌ Failed to move {failure_count} files. Please check the logs above.")
            |         print("✅ Organization process complete.")
            |     else:
            |         print("❌ Organization aborted by user.")
            | 
            | # --- Main Execution ---
            | 
            | if __name__ == "__main__":
            |     print(f"Starting FurryOS Project Organizer in '{PROJECT_ROOT}'")
            | 
            |     # Verify API key. This step is independent of file organization,
            |     # so execution continues even if the key is missing/invalid.
            |     verify_api_key(API_KEY_FILE)
            | 
            |     # Suggest file moves based on defined rules and ignored directories
            |     moves_to_perform = suggest_moves(PROJECT_ROOT, IGNORED_DIRS_TO_SKIP)
            | 
            |     # Execute the organization if any moves are suggested
            |     execute_organization(moves_to_perform, ARCHIVE_DIR, NEW_STRUCTURE_DIRS)
            --- CONTENT END ---
    [DIR] assets/
        heartbeat_core.c
        Makefile_optimized
        icon.png
        heartbeat_core_asm.s
        computer.png
        AnthroHeart_Trinity.png
        Gemini_API.key.txt
        Cio as Anthro.png
        healer_core.cpp
        [DIR] icons/
        [DIR] wallpapers/
            07 BlueHeart as Founder Lover.png
            13 Female Fox (Inner Circle).png
            09 12D Lyran Lion (Triad).png
            08 BlueHeart as Cio Lover.png
            02 Anthro Angel.png
            01 Divine Anthro.png
            anthroheart_collage.png
            Cio Anthro Fox Cub.png
            15 Cio as Anthro.png
            05 White Wolf as Lover.png
            14 9D Lyran Cat (Triad).png
            wallpaper.png
            01 Divine Anthro_upscayl_2x_realesrgan-x4plus.png
            03 AnthroHeart Trinity.png
            06 Native Dingo (Triad).png
            12 Female Dog (Inner Circle).png
            16 Master Tempter as Lover (Redeemed Shadow).png
            11 Male Dog (Inner Circle).png
            Anubis and Me.jpg
            Anthro Q.jpg
            Anthro Q2.jpg
            04 Cio as Founder.png
            10 Male Fox (Inner Circle).png
        [DIR] images/
            [DIR] AnthroHeart Saga/
                07 BlueHeart as Founder Lover.png
                13 Female Fox (Inner Circle).png
                09 12D Lyran Lion (Triad).png
                08 BlueHeart as Cio Lover.png
                02 Anthro Angel.png
                01 Divine Anthro.png
                anthroheart_collage.png
                15 Cio as Anthro.png
                05 White Wolf as Lover.png
                14 9D Lyran Cat (Triad).png
                01 Divine Anthro_upscayl_2x_realesrgan-x4plus.png
                03 AnthroHeart Trinity.png
                06 Native Dingo (Triad).png
                12 Female Dog (Inner Circle).png
                16 Master Tempter as Lover (Redeemed Shadow).png
                11 Male Dog (Inner Circle).png
                README.txt
                04 Cio as Founder.png
                10 Male Fox (Inner Circle).png
                [DIR] Me/
                    Cio Anthro Fox Cub.png
                    Anubis and Me.jpg
                    Cio Human.jpg
                [DIR] Anthro Q/
                    Anthro Q Questions.txt
                    Anthro Q.jpg
                    Anthro Poop Plush (Not Public Domain).jpg
                    Anthro Q2.jpg
            [DIR] The Warlock Name/
                warlock_collage.png
                Tree_Fox.png
                [DIR] Items/
                    Amulet.png
                [DIR] Characters/
                    4 Timekeeper Final.png
                    2 Magistro Final.png
                    5 Timewatcher Final.png
                    6 Xanther Final.png
                    3 Power Final.png
                    1 Warlock Final.png
                    7 Rezaeith Final.png
                    8 Hananni Final.png
                [DIR] Covers/
                    Warlock Cover Front.jpg
                    Tree Large Back.png
        [DIR] splash/
            background.png
        [DIR] sounds/
            startup.mp3
            [DIR] The AnthroHeart Music Library - Part 3/
                Draco’s Fall, Lyra’s Call.mp3
                Hashin' My Intentions.mp3
                Gummi Bears Lyrics in Sanskrit.mp3
                Dust of Dracos, Lyra’s Rise.mp3
                Divine Anthro my Lover.mp3
                Dreams on the Horizon.mp3
                Foxy in Love.mp3
                Eternal Anthro Bliss.mp3
                Divine Devotion.mp3
                Divine Anthro's Embrace (1).mp3
                Electric Dreams.mp3
                Divine Anthro at Heart.mp3
                Divine Heartbeat Extended.mp3
                Echoes of Your Love.mp3
                Frisky Guide to Moksha.mp3
                Hymn to Divine Anthro.mp3
                Divine Anthro’s Ska Love.mp3
                Heart of the Pack.mp3
                Divine Anthro's Embrace.mp3
                Escape to AnthroHeart.mp3
                Funny Furry Fun.mp3
                Forever Sunshine.mp3
                Gabriel’s Anthem for SweetHeart.mp3
                From Shadows to Starlight.mp3
                Groove Tonight.mp3
                Electric Visions of Moksha.mp3
            [DIR] The AnthroHeart Music Library - Part 2/
                Cosmic Chillout.mp3
                Cosmic Ambient.mp3
                Cosmic Ambient (1).mp3
                Becoming an Anthro Fox.mp3
                Divine Anthro Ska.mp3
                Divine Anthro Dreamstep.mp3
                Divine Anthro Relaxing.mp3
                Calm & Giggling.mp3
                Divine Anthro Deep Country.mp3
                Cosmic Ambient with Tribal Undertones.mp3
                Cosmic Ambient with Tribal Undertones (1).mp3
                Beyond Heaven, A Glimmering Drift.mp3
                Divine Anthro - Every Scent of You.mp3
                Cinematic Ambient 1.mp3
                Cosmic Ambient Electronica (1).mp3
                Cosmic Ambient Electronica.mp3
                BlueHeart My Divine Anthro Lover (Ambient Sky Drift).mp3
                Divine Anthro Rock.mp3
                Divine Anthro Disco 1.mp3
                BlueHeart My Divine Anthro Lover (Sky Drift).mp3
                Divine Anthro Devotion.mp3
                Divine Anthro Shining Bright.mp3
                Dancing with Divine.mp3
                Divine Anthro Celtic.mp3
                Divine Anthro Love Rock.mp3
                Divine Anthro Love.mp3
            [DIR] The AnthroHeart Music Library - Part 5/
                The Silly Android Day (Remastered).mp3
                Starlit Scent of the Wolf’s Grace 2.mp3
                The Mirror Never Lies.mp3
                Starlit Scent of the Wolf’s Grace 3.mp3
                Whispers of the Foxes.mp3
                Walking with Anthros (1).mp3
                Welcome to Our World.mp3
                White Wolf Anthro Lover.mp3
                Synthwave Star Trek.mp3
                The Prodigal Anthro Canine’s Redemption.mp3
                Walking with Anthros.mp3
                Village of Wonders.mp3
                The Universe Unfolds.mp3
                Whiskey and Heartache.mp3
                Space Ambient 1.mp3
                The Forest Fox.mp3
                The Silly Android Day.mp3
                The Fox's Song (Remastered).mp3
                The Fox's Song (1).mp3
                The Gentle Breeze.mp3
                White Wolf Anthro Sniffing.mp3
                Through Fur and Forever.mp3
                Universe Unleashed.mp3
                The Fox's Song.mp3
                Triad of Love_ Healing AnthroHeart.mp3
                What Do Anthros Smell Like_.mp3
                Steps to Devotion.mp3
                The Honor of the Sniff.mp3
                The Anthro Angel’s Forge.mp3
            [DIR] The AnthroHeart Music Library - Part 1/
                Ambient Electronic with Ethereal Undertones (3).mp3
                Anthro Love.mp3
                Ambient Folk with a Touch of Ethereal Fantasy.mp3
                Anthro Magic.mp3
                Ambient New Age with Ethereal Tribal Elements (2).mp3
                AnthroHeart’s Call.mp3
                Ambient Electronic with Ethereal Undertones (2).mp3
                Anthro Love Song.mp3
                Ambient Electronic.mp3
                A Gift Like No Other.mp3
                Ambient Electronic with Ethereal Undertones (1).mp3
                Anthros they light up my world (Remastered).mp3
                Anthros, they light up my world.mp3
                Ambient New Age with Ethereal Tribal Elements.mp3
                AnthroHṛdaya Stotram (Hymn of  AnthroHeart).mp3
                Anthro Dreams.mp3
                Anthro Wonderland.mp3
                Anthros Divine.mp3
                Ambient New Age with Ethereal Tribal Elements (1).mp3
                Ambient Folk with a Touch of Ethereal Fantasy (2).mp3
                Anthro Fantasia.mp3
                Ambient Electronic with Ethereal Undertones.mp3
                Anthros, they light up my world (1).mp3
                Anthro Dreams & Poop Plush Schemes.mp3
                Ambient Electronic (1).mp3
                Anthro Dreams (1).mp3
                Ambient Folk with a Touch of Ethereal Fantasy (1).mp3
            [DIR] The AnthroHeart Music Library - Part 4/
                I Love and Worship You, Anthros.mp3
                Rise to the Stars_ Ascension’s Call.mp3
                My Dog Used up All His Pee.mp3
                Moksha Fox Dreams.mp3
                Sacred Divine Anthro.mp3
                Paper Towels (Remastered).mp3
                In the Stillness of the Night.mp3
                Jesus Has a Universe 🕊️🐾.mp3
                Moksha Anthro Fox.mp3
                Push Desire to 11.mp3
                Midnight Serenity.mp3
                Sky in My Head.mp3
                Space Ambient 1 (1).mp3
                Rising Shadows.mp3
                Love in Moksha.mp3
                Sky in My Head (1).mp3
                Moksha Fox.mp3
                Skyline Devotion.mp3
                Octave Stand_ Michael’s Might 2.mp3
                Sniffing Through the Stars.mp3
                Paper Towels.mp3
                Lyra’s Refuge (SweetHeart’s Gift).mp3
                Solving 8 Paradoxes.mp3
                Master Tempter’s Redemption_ Forever Mine.mp3
                Nudist Starship Diplomacy.mp3
                Run to Redemption (Master Tempter’s Anthem).mp3
                Om Shri Divine Anthro Namah.mp3
                Relief in Your Light.mp3
                Onward We Go.mp3
                In Your Anthro Fur I Find Forever.mp3
            [DIR] The AnthroHeart Music Library - Part 6/
                ambient cosmic folk with a tribal twist (2).mp3
                ambient fantasy.mp3
                ambient cosmic folk with a tribal twist.mp3
                Wholesome.mp3
                You Got Me, Anthro Q!.mp3
                White Wolf Lover.mp3
                ambient fantasy (1).mp3
                ambient cosmic folk with a tribal twist (1).mp3
                ✨ _Anthro Beyond the Veil_ ✨.mp3
    [DIR] build_system/
        core_logic.asm
        Makefile
        install_toolchain.sh
            --- CONTENT START ---
            | #!/bin/bash
            | # Installs the Cross-Compiler toolchain on Debian 13
            | echo "🔧 Installing MinGW-w64 (Windows Compiler) and NASM (Assembler)..."
            | sudo apt-get update
            | sudo apt-get install -y build-essential mingw-w64 nasm make
            | echo "✅ Toolchain installed."
            --- CONTENT END ---
        wrapper_cli.c
        core_logic.o
        wrapper_gui.c
        core_logic.obj
        [DIR] bin/
            app_gui.exe
            app_console.exe
            app_linux
    [DIR] docs/
        SIGNING_GUIDE.md.txt
        BALENAETCHER_UPDATE_SUMMARY.md.txt
        MIT_LICENSE.txt
        TIMESTAMP.txt
        COMPILER_SPEC.md
        COMPLETE_ISO_SUMMARY.md.txt
        BUILD_OPTIONS.md.txt
        PEP668_FIX_GUIDE.md.txt
        PROGRESS_FEATURES.md.txt
        VERSION.txt
        ISO_README.txt
        INTEGRATION_COMPLETE.md.txt
        ETCHER_INCLUSION_GUIDE.md.txt
        PARTITION_CREATOR_GUIDE.md.txt
        MANIFEST.md
        C_ASSEMBLY_OPTIMIZATION.md.txt
        ASSEMBLY_OPTIMIZATION_PLAN.md.txt
        AFTER_DOWNLOAD_GUIDE.md.txt
        COMPLETE_FIX_GUIDE.md.txt
        ASSETS-README.md
        FurryOS_Complete_Documentation.pdf
        KERNEL-README.md
        FIX_SUMMARY.md.txt
        Gemini_API.key.txt
        SMART_PARTITION_GUIDE.md.txt
        USB_WRITING_GUIDE.md.txt
        VENV_GUIDE.md.txt
        ANTHROHEART_INCLUSION_GUIDE.md.txt
        README.md.txt
        PACKAGE_LIST.md.txt
        UPDATE_INSTRUCTIONS.md.txt
        BUILD_SUMMARY.md.txt
        txtfiles.py.txt
        UPDATE_SUMMARY.md.txt
        PROJECT-README.md
        PERSISTENCE_GUIDE.md.txt
        MANIFEST.txt
        FILE_ORGANIZATION.md.txt
        VERSION_REFERENCE.md.txt
        FRESH_BUILD_GUIDE.md.txt
        requirements.txt
        OVERVIEW.md
        MANIFEST.md.txt
        COPY_PASTE_COMMANDS.txt
        API_Key.txt
        [DIR] guides/
    [DIR] src/
        [DIR] tests/
        [DIR] kernel/
            initrd.img
            filesystem.squashfs
            vmlinuz

============================================================
DIAGNOSTIC SUMMARY:
❌ CRITICAL: Found 4 files still containing 'neofetch'. This WILL break the build.
